2015 Big Data and Manufacturing

From the wiki of Ilan Chamovitz - Informatics, Administration, Production Engineering, Education, Information Technology

What is involved when we talk about Big Data:

Retrieved from http://inside-bigdata.com/ (by Kevin Normandeau)

Volume

Big data implies enormous volumes of data. Data used to be created by employees; now it is generated by machines, networks and human interaction on systems like social media, so the volume of data to be analyzed is massive. Yet Inderpal states that the volume of data is not as much of a problem as other V's, like veracity.

Variety

Variety refers to the many sources and types of data, both structured and unstructured. We used to store data from sources like spreadsheets and databases. Now data comes in the form of emails, photos, videos, monitoring devices, PDFs, audio, etc. This variety of unstructured data creates problems for storing, mining and analyzing data. Jeff Veis, VP of Solutions at HP Autonomy, presented how HP helps organizations deal with big data challenges, including data variety.

Velocity

Big Data Velocity deals with the pace at which data flows in from sources like business processes, machines, networks and human interaction with things like social media sites and mobile devices. The flow of data is massive and continuous. If you are able to handle the velocity, this real-time data can help researchers and businesses make valuable decisions that provide strategic competitive advantage and ROI. Inderpal suggests that sampling data can help deal with issues like volume and velocity.
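
The sampling idea mentioned above can be illustrated with reservoir sampling, a standard way to keep a fixed-size uniform sample of a stream whose total length is unknown. This is a minimal sketch, not a production implementation; the stream and sample size are invented for the example:

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of k items from a stream of unknown length."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items
            sample.append(item)
        else:
            # Replace an existing element with probability k / (i + 1)
            j = random.randint(0, i)
            if j < k:
                sample[j] = item
    return sample

# Sample 5 events from a simulated high-velocity stream of 1,000,000 readings
events = range(1_000_000)
print(reservoir_sample(events, 5))
```

Because each item is examined once and then discarded, memory stays constant no matter how fast or long the stream runs.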

Veracity

Big Data Veracity refers to the biases, noise and abnormality in data. Is the data being stored and mined meaningful to the problem being analyzed? Inderpal feels that veracity is the biggest challenge in data analysis compared to things like volume and velocity. In scoping out your big data strategy, you need your team and partners to help keep your data clean, and processes in place to keep 'dirty data' from accumulating in your systems.
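
One simple way to keep 'dirty data' from accumulating is to screen incoming values against a statistical baseline. The sketch below uses a z-score rule; the readings and threshold are invented for the example, and in practice a median-based (robust) rule is often preferred, since a single extreme value inflates the mean and standard deviation themselves:

```python
import statistics

def flag_outliers(values, threshold=2.0):
    """Return indices of values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all values identical: nothing to flag
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

readings = [10.1, 9.8, 10.3, 10.0, 97.5, 10.2]  # one sensor glitch among normal readings
print(flag_outliers(readings))
```

Flagged records can then be quarantined for review instead of flowing into downstream analysis.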

Validity

Related to big data veracity is the issue of validity: is the data correct and accurate for the intended use? Clearly, valid data is key to making the right decisions. Phil Francisco, VP of Product Management at IBM, spoke about IBM's big data strategy and the tools it offers to help with data veracity and validity.

Volatility

Big data volatility refers to how long data is valid and how long it should be stored. In this world of real-time data, you need to determine at what point data is no longer relevant to the current analysis.
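
The retention decision described above can be expressed as a simple time-window filter. A minimal sketch; the 30-day window and record shape are assumptions for illustration:

```python
from datetime import datetime, timedelta

def purge_stale(records, now, retention=timedelta(days=30)):
    """Keep only records whose timestamp falls within the retention window."""
    cutoff = now - retention
    return [r for r in records if r["timestamp"] >= cutoff]

now = datetime(2015, 6, 1)
records = [
    {"id": 1, "timestamp": datetime(2015, 5, 28)},  # recent: kept
    {"id": 2, "timestamp": datetime(2015, 3, 1)},   # stale: dropped
]
print(purge_stale(records, now))
```

In a real system the same rule would typically be enforced by the storage layer (e.g. a table TTL) rather than by application code.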


Big Data people and blogs - http://www.datasciencecentral.com/profiles/blogs/top-data-science-bloggers


Big Data Trends 
13 New Trends in Big Data and Data Science     Posted by Vincent Granville on November 11, 2014 at 10:30am
   View Blog - http://www.datasciencecentral.com/profiles/blogs/13-new-trends-in-big-data-and-data-science

- Based on requests from clients - vendors of data processing platforms and products - as well as trends in popular blogs, job postings, and my own reading. Here are a few topics recently gaining strong traction (items beyond #13 were recently added):

- The rise of data plumbing, to make big data run smoothly, safely, reliably, and fast through all "data pipes" (Internet, Intranet, in-memory, local servers, cloud, Hadoop clusters etc.), optimizing redundancy, load balance, data caching, data storage, data compression, signal extraction, data summarization and more. We bought the domain name DataPlumbing.com last week.

- The rise of the data plumber, system architect, and system analyst (a new breed of engineers and data scientists), a direct result of the rise of data plumbing.

- Use of data science in unusual fields such as astrophysics, and the other way around (data science integrating techniques from these fields)

- The death of the fake data scientist

- The rise of right-sized data (as opposed to big data). Other keywords related to this trend are "light analytics", "big data diet", "data outsourcing", and the re-birth of "small data". Not that big data is going away - it is indeed getting bigger every second - but many businesses are trying to leverage an increasingly smaller portion of it, rather than being lost in a (costly) ocean of unexploited data.

- Putting more intelligence (sometimes called AI or deep learning) into rudimentary big data applications (currently lacking any true statistical science) such as recommendation engines, crowdsourcing or collaborative filtering. Purpose: detecting and eliminating spam, fake profiles, fake traffic, propaganda, attacks, scams, bad recommendations and other abuses, as early as possible.

- Increased awareness of data security and protection, against computer or business hackers.

- The rise of mobile data exploitation. For instance, processing billions of text messages to detect the spread of a disease or other global risks, to help design alarm systems, or to market the right product in real time (via opt-in, user-customized text messages) to a walking customer in a shopping mall. Not sure that even the NSA is capable of doing it as of today. The issue is more about capturing and reacting to the right signal, rather than absorbing/digesting big data. Another trend is optimization of revenue from mobile apps, leveraging mobile app dashboards.

- The rise of the "automated statistician": in short, automated, scalable, robust analytic solutions fit for batch processing, real-time, machine-to-machine communications, and black-box analytics used by non-experts. More on this in our upcoming book, entitled Data Science 2.0.

- Predictive modeling without models. Operations researchers and mathematicians are contributing to the science of prediction, bringing mathematical optimization and simulation as alternatives to delicate and mysterious statistical models.

- High performance computing (HPC), which could revolutionize the way algorithms are designed.

- Increased collaboration between government agencies worldwide to standardize data and share it, for intelligence purposes. Imagine the census bureau sharing data with the IRS, or banks in the US sharing data with security agencies in Switzerland.

- Forecasting space weather (best time / best location to land on Mars), and natural events on Earth (volcanoes, earthquakes, undersea weather patterns and their implications for humans, when Earth's magnetic field will flip).

- Use of data science for automated content generation (including content aggregation and classification); for automated correction of student essays; in court, to strengthen the level of evidence - or lack thereof - against a defendant; for plagiarism detection; for car traffic optimization and computing optimum routes; for identifying, selecting and keeping ideal employees; for automated IRS audits sent to taxpayers, to avoid costly litigation and wasted time; for urban planning; for precision agriculture.

- Measuring yield of big data or data science initiatives (that is, benefit after software and HR costs, over baseline)

- Digital health: diagnostic/treatment offered by a robot (artificial intelligence, decision trees) and/or remote doctors; digital law: same thing, with attorneys replaced by robots, at least for mundane cases or tasks. Even lawyers and doctors could have their jobs replaced by robots! This assumes that a lot of medical or legal data gets centralized, processed and made well structured for easy querying, updating and retrieval by (automated) deep learning systems.

- Analytic processes (even in batch mode) accessible from your browser anywhere on any device. Growth of analytics apps and APIs.
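
One of the trends above is putting more statistical rigor into collaborative filtering. As a minimal illustration of the underlying technique, the sketch below computes item-to-item similarity from user ratings; the tiny ratings matrix and item names are invented for the example:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two rating vectors (0.0 if either is all zeros)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Rows: items; columns: one rating per user (0 = not rated)
ratings = {
    "item_a": [5, 4, 0, 1],
    "item_b": [4, 5, 0, 2],
    "item_c": [0, 1, 5, 4],
}

# Items most similar to item_a, highest similarity first
sims = sorted(
    ((other, cosine_similarity(ratings["item_a"], v))
     for other, v in ratings.items() if other != "item_a"),
    key=lambda t: t[1], reverse=True,
)
print(sims)
```

Real recommendation engines add exactly the kind of statistical safeguards the trend list calls for - filtering fake profiles and fraudulent ratings before similarities are computed.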

Deep Learning

BigData and Manufacturing

Manufacturing News - Website - Advanced Machining Showcases New Facility and Technology December 1, 2013 http://www.mfgnewsweb.com/archives/3/7576/General-dec13/AdvancedMachiningShowcasesNewFacilityandTechnology.aspx


Article: How big data can improve manufacturing - Manufacturers taking advantage of advanced analytics can reduce process flaws, saving time and money. July 2014 | by Eric Auschitzky, Markus Hammer, and Agesan Rajagopaul

http://www.mckinsey.com/insights/operations/how_big_data_can_improve_manufacturing


When Big Data Meets Manufacturing by Stephen Chick, Serguei Netessine, INSEAD Professors of Technology and Operations Management and Arnd Huchzermeier, Chaired Professor of Production Management at WHU-Otto Beisheim School of Management | April 16, 2014 Read more at http://knowledge.insead.edu/operations-management/when-big-data-meets-manufacturing-3297#Gry1EJyPTSU8Syzx.99


Article Attitudes on How Big Data will Affect Manufacturing Performance [DATA] Posted by Greg Goodwin on Thu, Mar 06, 2014 @ 05:00 AM

http://blog.lnsresearch.com/blog/bid/194972/Attitudes-on-How-Big-Data-will-Affect-Manufacturing-Performance-DATA


4/2/2014 12:00 AM by Doug Henschen Merck Optimizes Manufacturing With Big Data Analytics Pharmaceutical firm uses Hadoop to crunch huge amounts of data so it can develop vaccines faster. One of eight profiles of InformationWeek Elite 100 Business Innovation Award winners.

http://www.informationweek.com/strategic-cio/executive-insights-and-innovation/merck-optimizes-manufacturing-with-big-data-analytics/d/d-id/1127901


11/28/2014 @ 9:40PM

Ten Ways Big Data Is Revolutionizing Manufacturing - Louis Columbus

http://www.forbes.com/sites/louiscolumbus/2014/11/28/ten-ways-big-data-is-revolutionizing-manufacturing/?utm_medium=twitter&utm_source=twitterfeed

In addition to the examples provided in the McKinsey article, there are ten ways big data is revolutionizing manufacturing:

1 Increasing the accuracy, quality and yield of biopharmaceutical production.

2 Accelerating the integration of IT, manufacturing and operational systems making the vision of Industrie 4.0 a reality.

3 Better forecasts of product demand and production (46%), understanding plant performance across multiple metrics (45%) and providing service and support to customers faster (39%) are the top three areas where big data can improve manufacturing performance.

4 Integrating advanced analytics across the Six Sigma DMAIC (Define, Measure, Analyze, Improve and Control) framework to fuel continuous improvement.

5 Greater visibility into supplier quality levels, and greater accuracy in predicting supplier performance over time.

6 Measuring compliance and traceability to the machine level becomes possible.

7 Selling only the most profitable customized or build-to-order configurations of products that impact production the least.

8 Breaking quality management and compliance systems out of their silos and making them a corporate priority.

9 Quantify how daily production impacts financial performance with visibility to the machine level.

10 Service becomes strategic and a contributor to customers’ goals by monitoring products and proactively providing preventative maintenance recommendations.
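
Several of the points above (notably integrating analytics into the DMAIC framework and measuring quality down to the machine level) rest on classic statistical process control. A minimal sketch of 3-sigma control limits for a measured process variable; the diameter measurements are invented for the example:

```python
import statistics

def control_limits(measurements, sigmas=3.0):
    """Return (lower, center, upper) control limits for a process variable."""
    center = statistics.mean(measurements)
    spread = sigmas * statistics.pstdev(measurements)
    return center - spread, center, center + spread

# Diameter measurements (mm) from an in-control machining process
samples = [10.02, 9.98, 10.01, 9.99, 10.00, 10.03, 9.97, 10.00]
lcl, center, ucl = control_limits(samples)

# Any reading outside the limits signals a special cause worth investigating
out_of_control = [x for x in samples if not lcl <= x <= ucl]
print(lcl, center, ucl, out_of_control)
```

In a big data setting the same computation runs continuously per machine, turning the Measure and Control phases of DMAIC into a real-time feed rather than a periodic audit.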


Takeaways from the MIT/Accenture Big Data in Manufacturing Conference Posted by Greg Goodwin on Wed, Nov 27, 2013 @ 06:00 AM

http://blog.lnsresearch.com/blog/bid/190482/Takeaways-from-the-MIT-Accenture-Big-Data-in-Manufacturing-Conference

- Enabling New Business Models with Big Data Capabilities (Professor David Simchi-Levi) - Rolls-Royce's aircraft division restructures its business model

- Leveraging Big Data to Create Closed-Loop Environments and Continuous Improvement Models - Accenture is working on a "smart water" infrastructure network able to predict leaks with high accuracy before they occur (Narendra Mulani)

- Pioneering a Management Revolution - at Amazon, decisions are also based "on data-minded individuals from all areas of the company who foresee ways in which new data collection can provide information that can improve the company in some way" (Brynjolfsson)

- The Pursuit of Performance Management Excellence
