
Intel unveils new ‘big data’ offering

California, February 27, 2013

US-based Intel Corporation has unveiled Intel Distribution for Apache Hadoop software, which enables more organisations and the public to use the vast amounts of data being generated, collected and stored every day – also known as “big data”.

The offering, which includes Intel Manager for Apache Hadoop software, is built from the silicon up to deliver industry-leading performance and improved security features, a statement from the company said.   

The ability to analyse and make sense of big data has profound potential to transform society by enabling new scientific discoveries, business models and consumer experiences. Yet, only a small fraction of the world is able to extract meaning from all of this information because the technologies, techniques and skills available today are either too rigid for the data types or too expensive to deploy.

Hadoop, an open source framework for storing and processing large volumes of diverse data on a scalable cluster of servers, has emerged as the preferred platform for managing big data, the statement said. With even more information coming from billions of sensors and intelligent systems on the horizon, the framework must remain open and scalable as well as deliver on the demanding requirements of enterprise-grade performance, security and manageability.
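To make the framework concrete, the sketch below is the canonical MapReduce word-count job written against the standard Apache Hadoop Java API (Mapper, Reducer and Job classes). It is a generic illustration of how Hadoop spreads work across a cluster of servers, not code taken from the Intel Distribution.

    import java.io.IOException;
    import java.util.StringTokenizer;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Mapper: runs in parallel on each block of input stored in HDFS
        // and emits (word, 1) for every token it sees.
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reducer: gathers the partial counts for each word from all mappers
        // and writes the final total.
        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class); // pre-aggregate counts on each node
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory in HDFS
            FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The same storage-and-compute pattern, with data distributed in HDFS and processed where it lives, is what the Intel Distribution packages and tunes.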

“People and machines are producing valuable information that could enrich our lives in so many ways, from pinpoint accuracy in predicting severe weather to developing customised treatments for terminal diseases,” said Boyd Davis, vice president and general manager of Intel’s datacenter software division.

“Intel is committed to contributing its enhancements made to use all of the computing horsepower available to the open source community to provide the industry with a better foundation from which it can push the limits of innovation and realise the transformational opportunity of big data.”
       
With silicon-based encryption support incorporated into the Hadoop Distributed File System, organisations can now analyse their data sets more securely without compromising performance, the statement said.
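As a rough illustration of the mechanism, the sketch below encrypts and decrypts a record with AES through the standard Java javax.crypto API. On Xeon processors with the AES-NI instructions, the JVM can run these routines on the hardware instructions, which is the general idea behind encrypting data without a large performance penalty. This is a generic example, not the Intel Distribution’s own HDFS encryption code, and the record contents are made up.

    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;
    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.IvParameterSpec;

    public class AesRoundTrip {
        public static void main(String[] args) throws Exception {
            // Generate a 128-bit AES key. On AES-NI capable CPUs the JVM's
            // built-in AES intrinsics accelerate the cipher transparently.
            KeyGenerator keyGen = KeyGenerator.getInstance("AES");
            keyGen.init(128);
            SecretKey key = keyGen.generateKey();

            // Random initialisation vector for CTR mode.
            byte[] iv = new byte[16];
            new SecureRandom().nextBytes(iv);

            Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
            byte[] ciphertext = cipher.doFinal(
                    "patient-record-42,diagnosis=example".getBytes(StandardCharsets.UTF_8));

            // Decrypt with the same key and IV to recover the original record.
            cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
            byte[] plaintext = cipher.doFinal(ciphertext);
            System.out.println(new String(plaintext, StandardCharsets.UTF_8));
        }
    }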

The optimisations made for the networking and I/O technologies in the Intel Xeon processor platform also enable new levels of analytic performance. Analysing one terabyte of data, which previously took more than four hours to fully process, can now be done in seven minutes, a speed-up of more than 30 times, thanks to the data-crunching combination of Intel’s hardware and the Intel Distribution.

Intel estimates that the world generates one petabyte (1,000 terabytes) of data every 11 seconds, the equivalent of about 13 years of HD video; the power of Intel technology, it said, opens up the world to even greater possibilities.
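The HD-video equivalence is easy to sanity-check. The back-of-the-envelope calculation below, which assumes a playback bitrate of roughly 20 Mbit/s (an assumption, not an Intel figure), lands at about 13 years per petabyte.

    public class PetabyteCheck {
        public static void main(String[] args) {
            double petabyteInBits = 1e15 * 8;        // 1 PB = 10^15 bytes = 8 x 10^15 bits
            double hdBitrate = 20e6;                 // assumed HD video bitrate: ~20 Mbit/s
            double seconds = petabyteInBits / hdBitrate;
            double years = seconds / (365.25 * 24 * 3600);
            System.out.printf("1 PB of HD video at 20 Mbit/s is about %.1f years%n", years);
            // Prints roughly 12.7 years, consistent with the article's "13 years" figure.
        }
    }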

For example, in a hospital setting, the intelligence derived from this data could improve patient care by helping caregivers make quicker and more accurate diagnoses and by determining drug effectiveness, drug interactions, dosage recommendations and potential side effects through the analysis of millions of electronic medical records, public health data and claims records, it said. Strict guidelines also exist globally for protecting health and payment information, making it imperative to maintain security and privacy while performing analytics.

The addition of the Intel Manager for Apache Hadoop software simplifies the deployment, configuration and monitoring of the cluster for system administrators as they look to deploy new applications.

The Intel Active Tuner for Apache Hadoop software automatically configures the cluster for optimal performance, taking the guesswork out of performance tuning. Until now, this required a specialised understanding of each application’s use of system resources, along with the Hadoop configuration and performance benchmarks, it said.

Intel is working with strategic partners to integrate this software into a number of next-generation platforms and solutions, and to enable deployment in public and private cloud environments.
       
The new software offering expands Intel’s extensive portfolio of datacenter computing, networking, storage and intelligent systems products. The recently introduced Intel Intelligent Systems Framework, a set of interoperable solutions designed to enable connectivity, manageability and security across intelligent devices in a consistent and scalable manner, sets the foundation for gathering, analysing and delivering valuable information for end-to-end analytics from the device to the datacenter.

Additionally, Intel continues to invest research and capital in advancing the big data ecosystem. Intel Labs is at the forefront of advanced analytics research, which includes the development of Intel Graph Builder for Apache Hadoop software, a library that constructs graphs from large data sets to help visualise the relationships within the data.

Intel Graph Builder is optimised for the Intel Distribution to help reduce development time by eliminating the need to write large amounts of custom code. Meanwhile, Intel Capital has been making major investments in disruptive big data analytics technologies, including 10gen, the company behind MongoDB, and big data analytics solution provider Guavus Analytics. – TradeArabia News Service



