
Date: Friday, October 12, 2012, 4:15 p.m.
Duration: 45 minutes

Enterprises face ever-increasing amounts of data, and problems related to volume and usage demands have forced IT managers and developers to seek out new solutions. Fortunately, this has spurred an explosion of innovation in massively parallel processing and non-relational data storage. Apache Hadoop, an open source software platform, has quickly become the technology of choice for large organizations that need sophisticated analysis and transformation of petabytes of structured and complex data.

Cédric Carbone will discuss how users can access, manipulate and store huge volumes of data in Hadoop and benefit from high-performance, cost-optimized data integration with ultimate scalability.

During this session, attendees will learn how to:
• Leverage the explosive growth of data
• Deploy and tap into the powerful architecture of Hadoop thanks to an Eclipse-based graphical interface
• Process massive data volumes through a combination of Hadoop and open source architecture
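As background to the session topics above, the MapReduce model that Hadoop popularized can be illustrated with a minimal local sketch. This is purely illustrative and assumes nothing about Talend's tooling or Hadoop's actual Java API; a real Hadoop job would be written against the MapReduce or streaming interfaces and run across a cluster:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (word, 1) pairs from each input line
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group values by key, as Hadoop does between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data with Hadoop", "big data at scale"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])  # 2
```

The point of the model is that the map and reduce steps are independent per key, which is what lets Hadoop distribute them across many machines.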

Bio: Cédric Carbone has been Talend's Chief Technical Officer since the company's inception. He manages the big data, data integration, data quality, MDM, and ESB product lines with an international team of more than 140 R&D engineers. He is also a board member of the Eclipse Foundation and the OW2 consortium.