Addressing Big Data Challenges Through Innovative Architecture, Databases, and Software
Design
Abstract
The ability to collect and analyze large amounts of data is a growing challenge within the scientific community. The widening gap between data and users calls for innovative tools that address the challenges posed by big data volume, velocity, and variety. The Massachusetts Institute of Technology's Lincoln Laboratory has taken a leading role in developing a set of tools to address these challenges. Big data volume stresses the storage, memory, and compute capacity of a computing system and requires access to a computing cloud. Big data velocity stresses the rate at which data can be absorbed and meaningful answers produced; it can be addressed by the NSA-led Common Big Data Architecture (CBDA). Big data variety may present both the largest challenge and the greatest set of opportunities, because the promise of big data is the ability to correlate diverse and heterogeneous data to generate new insights. The centerpiece of the CBDA is the NSA-developed Apache Accumulo database (capable of ingesting millions of entries per second) and the Lincoln Laboratory-developed D4M schema. This talk will concentrate on how we use these innovative technologies in our mission to apply advanced technology to problems of national security.
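For readers unfamiliar with how data reaches Accumulo at the ingest rates mentioned above, the sketch below shows the basic client-side write path using Accumulo's standard 1.x Java API (ZooKeeperInstance, Connector, BatchWriter, Mutation). The instance name, ZooKeeper host, credentials, table name, and row/column values are placeholder assumptions for illustration only; they are not taken from the talk.

```java
import org.apache.accumulo.core.client.BatchWriter;
import org.apache.accumulo.core.client.BatchWriterConfig;
import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.ZooKeeperInstance;
import org.apache.accumulo.core.client.security.tokens.PasswordToken;
import org.apache.accumulo.core.data.Mutation;
import org.apache.accumulo.core.data.Value;

public class IngestSketch {
  public static void main(String[] args) throws Exception {
    // Placeholder instance name, ZooKeeper host, and credentials.
    ZooKeeperInstance instance = new ZooKeeperInstance("myInstance", "zk-host:2181");
    Connector conn = instance.getConnector("user", new PasswordToken("password"));

    // Create the (hypothetical) table if it does not already exist.
    if (!conn.tableOperations().exists("records")) {
      conn.tableOperations().create("records");
    }

    // BatchWriter batches and streams mutations to the tablet servers;
    // high ingest rates come from many clients writing in parallel.
    BatchWriter writer = conn.createBatchWriter("records", new BatchWriterConfig());

    // One Mutation per row; each put() adds a column family/qualifier/value entry.
    Mutation m = new Mutation("row0001");
    m.put("field", "name", new Value("example".getBytes()));
    writer.addMutation(m);

    writer.close(); // flushes any buffered mutations
  }
}
```

In practice, schemas such as D4M sit above this write path, determining how source records are decomposed into Accumulo rows and columns before they are handed to the BatchWriter.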
Speakers
Vijay Gadepally
Vijay Gadepally is a member of the technical staff at the Massachusetts Institute of Technology Lincoln Laboratory on the Computing and Analytics team. Vijay's research is in the areas of high-performance computing, big data systems, analytics, and advanced database technologies. He holds a Ph.D. in Electrical and Computer Engineering from The Ohio State University and a B.Tech degree in Electrical Engineering from the Indian Institute of Technology, Kanpur. Vijay has also worked at Raytheon Company and Rensselaer Polytechnic Institute.