Quintillion Bytes of Data Is No Problem With Big Data

Published on 06 June 13

Whenever we upload, download, store, or update information in our daily computing lives, the data ends up on servers. Now imagine all the important functions run by governments, militaries, financial institutions, schools, public services, social security agencies, and other organizations: all of them keep large quantities of data to make our lives easier. Every day, the world creates around 2.5 quintillion bytes of data. The quantity is so big that experts say 90% of the data in existence today was created in just the last two years! All of this data comes from digital pictures, videos, point-of-sale (POS) records, mobile GPS signals, social media posts, climate information, news feeds, scientific research, file sharing, and the other things we do with our computers. This is what we call big data.


Big data, as the name implies, is a collection of data sets, both structured and unstructured, so large and complex that they cannot be processed with traditional, on-hand database management and processing applications. When vendors use the term, it usually refers to the technology and storage facilities that can handle such large amounts of data. Think about it: with millions of computer and internet users and all this technology, how do we manage such large amounts of information? Big data sizes are constantly growing; as of 2012, a single data set could range from a few terabytes to petabytes. With improvements to the traditional database management system (DBMS) and the introduction of new databases, the ability to handle larger amounts of data is within reach. Implementing new tools is still difficult, so new platforms are being developed to handle different types of large data.
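To see why a load-everything-into-memory approach breaks down at these sizes, here is a minimal sketch of the alternative: streaming a large file and aggregating as you go, so memory use stays bounded no matter how big the input grows. The file name and one-event-per-line record format are hypothetical illustration choices, not anything from a specific big data platform.

```python
# Sketch: stream a large log file in bounded batches instead of
# loading the whole thing, tallying event types as we go.
from collections import Counter

def count_events(path, batch_size=100_000):
    """Tally event names from a file with one event per line,
    never holding more than batch_size lines in memory."""
    totals = Counter()
    batch = []
    with open(path) as f:
        for line in f:
            batch.append(line.strip())
            if len(batch) >= batch_size:
                totals.update(batch)  # fold the batch into the totals
                batch.clear()
    totals.update(batch)  # fold in the final partial batch
    return totals
```

Real big data platforms generalize this same idea, splitting the stream across many machines and merging the partial tallies, but the memory-bounded principle is the same.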


Big Data Applied To Business


Analyst Doug Laney originally defined data growth along three dimensions, but by 2012 the definition had been updated to four defining characteristics of big data:

  1. Volume: enterprises and large organizations amass large quantities of data every day, ranging from terabytes to petabytes.
  2. Velocity: speed is key when making decisions based on solid results from the information and data gathered.
  3. Variety: any type of data, structured or unstructured.
  4. Veracity: decision makers do not always trust the information available to them. Well-researched data backed by solid analytics makes decision making easier.

Business intelligence uses a process called descriptive statistics, applying high-density data to measure things and detect trends in the market. Big data also uses inductive statistics on low-density data, whose sheer volume allows regressions that give big data enhanced predictive capabilities. The potential of the technology is massive, and it is about more than size. Experts and industry leaders describe it as an opportunity to work with new and emerging types of data content, making businesses more flexible and agile and better able to answer the challenges of the future.
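The contrast between the two modes above can be sketched in a few lines: descriptive statistics summarize what the data already says, while a simple least-squares regression extrapolates the trend forward. The monthly sales figures here are made-up illustration data, not from any real source.

```python
# Sketch: descriptive statistics (summarize) vs. a simple
# inductive/predictive step (fit a trend line and forecast).
from statistics import mean, stdev

sales = [120, 135, 150, 160, 178, 190]  # hypothetical monthly units sold

# Descriptive: measure and summarize the data as-is.
avg = mean(sales)       # average monthly sales
spread = stdev(sales)   # how much months vary around that average

# Predictive: fit a least-squares line through (month, sales)
# and project one month ahead.
n = len(sales)
xs = list(range(n))
x_bar, y_bar = mean(xs), mean(sales)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, sales))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar
forecast = slope * n + intercept  # projected sales for the next month
```

Real big data analytics runs far richer models over far more data, but the division of labor (describe the present, then infer the future) is the same.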






