Hortonworks brings Hadoop to Windows
Hortonworks is bringing the popular open-source Apache Hadoop data processing platform to Microsoft shops.
A new business version of Evernote gives IT managers control over a piece of cloud-based software commonly installed by their employees.
DARPA (the U.S. Defense Advanced Research Projects Agency) has awarded $3 million to software provider Continuum Analytics to help fund the development of Python's data processing and visualization capabilities for big data jobs.
In an interview, the company's CEO and senior director of products discuss the mobile possibilities of their offering and defend Java's security.
InfoWorld's 2013 Technology of the Year Award winners stretch from devices and desktops to data centers and beyond.
Users of Amazon Web Services will soon be able to orchestrate workflows across different AWS services and their own internal resources, using a new orchestration engine called the AWS Data Pipeline.
With the commercial release of version 5 of its self-named reporting and analysis suite, Jaspersoft has revamped the software's visualization engine, replacing its Adobe Flash-based renderer with one built on HTML5 Web standards.
Facebook has overcome some of the limitations of the Apache Hadoop data processing platform, its engineers assert.
At IBM's Information On Demand and Business Analytics Forum, being held this week in Las Vegas, the company announced a number of new add-ons and services designed to help organizations analyze their expanding data sets more quickly.
The proliferation of large-scale data sets is just beginning to change business and science around the world, but enterprises need to prepare in order to gain the most advantage from their information, panelists said at a Silicon Valley event this week.
The Greenplum division of EMC is building a single data analytics platform that can crunch both structured and unstructured data and give a broad range of users the tools to study an enterprise's information.
Everyone is a trend watcher. But at a certain point, to determine which trends will actually weave their way into the fabric of business computing, you need to first take a hard look at the technologies that gave life to the latest buzz phrases.
For the last few years, the world of NoSQL databases has been filled with exciting new projects, ambitious claims, and plenty of chest beating. The hypesters said the new NoSQL software packages offered tremendous performance gains by tossing away all of the structure and paranoid triple-checking that database creators had lovingly added over the years. Reliability? It's overrated, said the new programmers who didn't run serious business applications for Wall Street banks but trafficked in trivial, forgettable data about people's lives. Tabular structure? It's too hidebound and limiting. If we ignore these things, our databases will be free and insanely fast.
Year after year, the cost of disk space has plummeted. Since you can pick up a terabyte for $50, it's often seemed a false economy to be careful with storage.
Informatica has strengthened its hand in the burgeoning market for Hadoop, the open-source programming framework for large-scale data processing, unveiling a new data parser on Wednesday that can transform piles of unstructured information into a more structured form for use in running Hadoop jobs.