Wednesday 10 August 2011

"Great knowledge" Analysis of Major Challenges to Traditional Approaches

Terabytes, or even petabytes, have quickly become the reference points of the data warehousing business, and many companies are preoccupied with the sheer volume and variety of "big data." What usually gets ignored amid all that storage of structured and unstructured material, however, is the analysis problem that goes with it: converting the raw data into usable, real-time business intelligence (BI) that supports smarter decisions.

Traditional relational data warehouse and business intelligence initiatives are, for the most part, well planned: data is added on a regular schedule, and mature products serve well-defined applications. The intention is to let business users slice and dice the data, get a general picture of what is happening and create reports that answer questions such as, "What was the revenue for a specific region over a given time period?"
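
As a minimal sketch of that kind of slice-and-dice question, the snippet below answers it in Python with the pandas library; the table and its column names (region, order_date, revenue) are made-up assumptions for illustration, not taken from any particular BI product.

```python
import pandas as pd

# Made-up sales records; the column names are assumptions for this sketch.
sales = pd.DataFrame({
    "region": ["East", "West", "East", "South"],
    "order_date": pd.to_datetime(
        ["2011-01-15", "2011-02-03", "2011-03-22", "2011-02-28"]),
    "revenue": [12000.0, 8500.0, 4300.0, 9900.0],
})

# "What was the revenue for a specific region over a given time period?"
east_q1 = sales[
    (sales["region"] == "East")
    & (sales["order_date"].between("2011-01-01", "2011-03-31"))
]["revenue"].sum()

print(east_q1)  # 16300.0
```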

Big data analytics opens the door for companies to explore new information models, shake up established ideas about how to apply data analysis, ask questions they might not have thought to ask otherwise and, finally, create strategies that provide a competitive advantage.

"Over the last 20 to 25 years, companies can take up to 5%, perhaps focused on the information presented to them," Brian Hopkins, of Cambridge, Massachusetts, principal analyst at Forrester Research Inc. "All we said, did not know what to do with the fall on the floor and fell into space. better able to compete for companies seeking to penetrate 95% of other data is swimming around them; they can do better than anyone."

Traditional RDBMSs struggle to keep up

Unlike traditional relational database management systems (RDBMSs), which are geared to processing structured data, big data warehouses are flooded with feeds from a wide variety of sources: social networking sites such as Facebook and Twitter, blogs, daily Web activity logs, location information from Global Positioning System devices, bar code and RFID readers, and machine-generated data produced by sensors.

Gartner Inc., of Stamford, Conn., estimates that overall data volumes are growing at an annual rate of 59%. And beyond coping with volume and variety, companies run into difficulties around velocity, or how fast the data can be processed; handling it well can offer large companies certain advantages.

Traditional data management practices are not appropriate, either economically or operationally, for managing the volume and velocity requirements of these so-called big data storage systems, and big data puts excessive pressure on BI in terms of performance.

"To achieve a more precise analysis of the new rates, or how to use the detailed analysis of the data have a paradigm shift in terms of" David Menninger, Ventana Research, Inc., San vice president and research director specializing in the management data strategies Ramon, Calif., consulting firm.

Big data analytics calls for more data processing power

A range of new technologies is designed to meet the key big data challenges: analyzing massive volumes, variety and velocity of data. At the heart of the new movement is massively parallel processing (MPP) database technology, which automatically divides workloads across multiple commodity servers and processes them in parallel, maintaining database performance while working with very large data sets.
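
To make the divide-and-conquer idea concrete, here is a minimal local sketch of the MPP pattern in Python, using the standard multiprocessing module: the data is split into shards, each "node" aggregates its shard in parallel and the partial results are merged. A real MPP database distributes this work across separate servers rather than processes on one machine.

```python
from multiprocessing import Pool

def partial_sum(shard):
    # Each "node" aggregates only its own shard of the data.
    return sum(shard)

if __name__ == "__main__":
    data = list(range(1_000_000))   # stand-in for one large table column
    workers = 4                     # stand-in for commodity servers
    size = len(data) // workers
    shards = [data[i * size:(i + 1) * size] for i in range(workers)]

    with Pool(workers) as pool:
        # Shards are aggregated in parallel, then merged: the same
        # split/aggregate/merge shape an MPP database applies to queries.
        total = sum(pool.map(partial_sum, shards))

    print(total)  # 499999500000
```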

This basic architecture is often complemented by columnar databases, which store data in columns rather than rows to reduce the amount of storage space required and accelerate the speed at which users can run queries. Analytic databases, designed for heavy BI and analytics workloads, are another technology being evaluated; they let companies respond to queries against large amounts of data faster, more efficiently and more economically. The same goes for data warehouse appliances, which package server, storage and data management technology tuned to the needs of large databases in one integrated bundle.
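
The storage trade-off behind columnar databases is easy to see in miniature. This sketch, with made-up column names, contrasts a row-oriented layout with a column-oriented one; a query that scans a single column can skip the rest of the data entirely in the columnar layout.

```python
# Row-oriented layout: each record is stored together.
rows = [
    ("East", "2011-01-15", 12000.0),
    ("West", "2011-02-03", 8500.0),
    ("East", "2011-03-22", 4300.0),
]

# Column-oriented layout: each column is stored together.
columns = {
    "region":  ["East", "West", "East"],
    "date":    ["2011-01-15", "2011-02-03", "2011-03-22"],
    "revenue": [12000.0, 8500.0, 4300.0],
}

# Summing revenue from rows touches every field of every record ...
total_from_rows = sum(rec[2] for rec in rows)

# ... while the columnar layout reads only the one array it needs,
# which is why analytic scans and compression fare so well on it.
total_from_columns = sum(columns["revenue"])

assert total_from_rows == total_from_columns == 24800.0
```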

These capabilities tend to sit outside of core RDBMS technology, and while some have been around for a while, they are getting a significant amount of fresh attention when it comes to big data analytics. Among the most widely reported of the new measures is the open source Hadoop platform, which is especially suited to managing large volumes of unstructured data such as text, video, social media posts and data streams. Hadoop is built on a file system distributed across clusters of computers and includes the MapReduce framework for processing large data sets; related projects such as Cassandra, Hive and others round out the emerging stack.
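
As a rough illustration of the MapReduce model at the heart of Hadoop, here is the classic word count as a small, self-contained Python simulation. A real Hadoop job distributes the map and reduce phases across a cluster; this sketch only mimics the shape of the computation in one process.

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (key, value) pair for every word in the input.
    for word in document.split():
        yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle/reduce: group the pairs by key, then aggregate each group.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data needs big ideas", "data beats opinions"]
pairs = (pair for doc in docs for pair in map_phase(doc))
print(reduce_phase(pairs))
# {'big': 2, 'data': 2, 'needs': 1, 'ideas': 1, 'beats': 1, 'opinions': 1}
```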

Hadoop draws attention for big data use

More than half of the 163 participants (54%) in Ventana Research's new "Hadoop and Information Management" benchmark report said they are using or evaluating Hadoop as part of big data initiatives, driven largely by a desire to analyze greater amounts of data than they previously could. That first wave of enthusiasm comes even though the tools are relatively immature, real-time analytics on Hadoop remains limited and only a handful of packaged products designed to shield users from the complexity of the open source technology are available today.

The new technologies pose a challenge for data warehousing professionals, but the move to big data analytics is a real issue for companies: those that want to do more with high volumes of data must develop the ability to make the move or risk losing business.

"Skin that has been around for a long time, but rather much more," he Yvonne Genovese, vice president and analyst at Gartner, said. Information and knowledge to turn data into action and the directors of data storage can really make his mark, he said.
