BI and Analytics - News, Features, and Slideshows

News

  • Hadoop challenger works to add developers

    LexisNexis has spent more than a decade developing a large-scale system for Big Data manipulation, and it believes it has produced something better and more mature than the better-known Hadoop technology.

  • Off to a Fast Start

    Feeling like your business intelligence efforts are a bit sluggish and out of touch with what the company needs? Maybe it's time to try agile BI, a rapid development methodology that solicits end-user input early and often and delivers BI systems fast.

  • IT must prepare for Hadoop security issues

    NEW YORK -- Corporate IT executives need to pay attention to numerous potential security issues before using Hadoop to aggregate data from multiple, disparate sources, analysts and IT executives said at the Hadoop World conference here this week.

  • Hadoop ready for corporate IT, execs say

    NEW YORK -- Despite some lingering technology issues, Hadoop is ready for enterprise use, IT executives said Tuesday at the Hadoop World conference here.

  • Hadoop creator expects surge in interest to continue

    Doug Cutting, the creator of the open-source Hadoop framework that allows enterprises to store and analyze petabytes of unstructured data, led the team that built one of the world's largest Hadoop clusters while he was at Yahoo. The former engineer at Excite, Apple and Xerox PARC is also the developer of Lucene and Nutch, two open-source search engine technologies now managed by the Apache Software Foundation. Cutting is now an architect at Cloudera, which sells and supports a commercial version of Hadoop and which this week will host the Hadoop World conference in New York. In an interview, Cutting talked about the reasons for the surging enterprise interest in Hadoop.

  • Big data goes mainstream

    We've all heard the predictions: By 2020, the quantity of electronically stored data will reach 35 trillion gigabytes, a forty-four-fold increase from 2009. We had already reached 1.2 million petabytes, or 1.2 zettabytes, by the end of 2010, according to IDC. That's enough data to fill a stack of DVDs reaching from the Earth to the moon and back -- about 240,000 miles each way.

  • Oracle does about-face on NoSQL

    SAN FRANCISCO -- Oracle's introduction of its Big Data Appliance at the OpenWorld conference here this week is an indication of the attention it is being forced to pay to NoSQL database technology.
