Big Data and the Public Sector

Jun 27, 2012

In my May 30th blog, I talked about how Big Data discussions are starting to gain traction and build buzz across the federal government. Last week I had an opportunity to attend the FCW Federal Briefing on Big Data, which turned out to be a very good event. The packed room of government employees, decision makers, vendors, solution providers, and academics had a lively discussion on some excellent topics. There were keynotes and panel discussions on Big Data by some very influential folks, including Dr. Suzi Iacono (Sr. Science Advisor at NSF), Dr. George Strawn (Director at NCO and Co-Chair at NITRD), Dr. Chris Greer (Associate Director at NIST), and several others. This one-day seminar centered on the emerging challenges, trends, directions, and solutions for Big Data and Data Analytics. The speakers provided a consistent understanding of the challenges and opportunities associated with Big Data, sharing some thought-provoking nuggets of information.

Agency data is growing exponentially from a volume, velocity, and variety perspective (both structured and unstructured data, including full-motion video, emails, voice, social networks, sensor-enabled facilities, web, and biometrics data). Agencies are trying to figure out how they can best store, merge, manage, and leverage this unbelievable amount of data and make it actionable. And they are doing so while also addressing the security and privacy challenges that come with merging federal data from various sources.

In order to harness the power of Big Data, agencies need to create a data ecosystem. They need to collaborate with academia and industry to leverage innovation and technological developments. Systems have to be architected with security, interoperability, and open standards in mind from conception. Federal agencies have to work through processes, policies, and politics and break down data silos. Advances in hardware and software that can process multi-structured and multi-sourced data, along with new kinds of scalable storage including solid-state memory and parallel architectures, are required to make Big Data a reality. Dr. George Strawn remarked that he had never before seen a technology evolve so fast and go from a “bag of tricks” to working applications. Dr. Strawn made it clear that the focus in government has transitioned from Open Government to Cloud Computing to Big Data, and that if we do not take this seriously and start producing real results in the near term, the focus might shift again to something new like “Cyber-physical Systems” next year.

I truly believe Big Data and Advanced Analytics are valuable to the agency mission. I am happy to see the collaboration between agencies, standards bodies, and industry finally taking shape to tackle Big Data problems. I am honored to be serving as a commissioner on the TechAmerica Foundation Big Data Commission, helping to shape and provide guidance on how government can best leverage Big Data to address critical business issues and better deliver on the mission. It is now time to put on our thinking caps and start determining how to make the most of the opportunity presented to us by the challenge of Big Data. In the coming weeks and months, I will be blogging on many different aspects of Big Data. Let me know your thoughts. Follow me on Twitter at GTSI_Architect.
