CIMdata PLM Industry Summary Online Archive

18 April 2012

CIMdata News

Analytics, PLM Converge Amid Data Tsunami; A Ceaseless Search for Sustainable Advantage

The enormous amounts of digital data that businesses create, and the increasing rates of data’s accumulation, present a growing challenge as well as an opportunity to be seized. The opportunity is powerful new capabilities emerging in business analytics—smarter user interfaces and more powerful search engines that reach more broadly and deeply.

These advances have big implications for those who use product lifecycle management (PLM) strategies and enabling technologies to build sustainable competitive advantages, especially in any type of manufacturing enterprise. Engineers in manufacturing companies already spend half or more of their time searching for and validating information in massive computer databases.

For several reasons, including the tsunami of digital data atop the proliferation of new products, file duplication, and new file formats, the challenge will get worse before it gets better. The same is true, in the short term, for “connecting the dots” to develop insights for decision making amid fierce global competition.

Insights, of course, are the building blocks of good decisions. The ability of an enterprise to consistently make good decisions is the foundation of every sustainable competitive advantage and may even be the only one that matters. Business analytics is the technology, science, and art of extracting usable information from networks of computers and the databases they manage.

Also important in analytics is a core concept from product engineering and design: the single source of truth. At any stage in a new product’s development cycle there should be one authoritative version of the data, and that principle applies equally to analytics. Bringing users unerringly to the single source of truth is the goal of product data management (PDM) and all other similar computerized systems for managing information.

But those systems generally focus on day-to-day engineering processes at the business-unit level. That is not the case with PLM. PLM embraces and enables a myriad of processes and information sources scattered throughout the enterprise and a product’s life. This alone is enough to justify bringing business analytics into PLM strategies and environments.

Insightful decision making and searching out the single source of truth are just two of the ways that analytics goes far beyond search engines, browsers, and even the most sophisticated data mining. Business analytics is sometimes confused with “business intelligence,” and the two terms are often used interchangeably. Many industry analysts point out that business intelligence gathers information while business analytics provides context, but there is much more to it than that. Regardless of the label, these capabilities have long been a complement to PLM.

What’s new is that business analytics presents a rare opportunity to add on and even embed a new form of empowerment, as when analytics generates insights by finding connections that were previously overlooked or hidden.

From our standpoint at CIMdata, as the global leader in PLM research and management consulting, the convergence of analytics into PLM cannot happen too soon. Enterprises everywhere struggle amid the digital tsunami, a multiplicity of data formats, highly distributed data sources, compressed development cycles, smaller windows of opportunity, product proliferation, increased energy costs, closer attention from regulators, and unceasing competitive pressure to do more with less. Business as usual, in other words!

Quantifying the digital tsunami is like counting molecules in a mountain. According to Cisco Systems Inc., San José, Calif., worldwide data center traffic reached 1.1 zettabytes in 2010. Numerically speaking, a zettabyte is a 1 followed by 21 zeroes; a gigabyte is a 1 followed by nine zeroes. This digital traffic will grow at a compound annual growth rate (CAGR) of 33 percent through 2015, to an annual rate of 4.8 zettabytes. Within that total, Cisco experts calculate that traffic from cloud services will grow at a CAGR of 66 percent, from 121 exabytes in 2010 to 133 exabytes a month in 2015. Business traffic will triple, but consumer-driven traffic will grow five-fold.
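
For readers who want to check the arithmetic, the projection can be reproduced in a few lines of Python from the figures Cisco cites; the calculation itself is ours, purely for illustration.

```python
# Back-of-the-envelope check of the Cisco figures cited above.
base_traffic_zb = 1.1   # worldwide data center traffic in 2010, zettabytes per year
cagr = 0.33             # compound annual growth rate through 2015
years = 5               # 2010 -> 2015

projected_zb = base_traffic_zb * (1 + cagr) ** years
print(f"Projected 2015 traffic: {projected_zb:.1f} ZB/year")  # ~4.6 ZB, in line with the cited 4.8

# Unit sanity check: a zettabyte is 10**21 bytes, a gigabyte 10**9 bytes.
print(f"1 ZB = {10**21 // 10**9:,} GB")  # 1,000,000,000,000 GB, i.e. a trillion gigabytes
```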

Another measurement of so-called “Big Data” is the total amount of global data, that is, what sits on disk drives rather than what moves as Internet traffic. According to International Data Corp., Framingham, Mass., that total is expected to grow to 2.7 zettabytes during 2012, up 48 percent from 2011.

What makes the information avalanche unnerving is that it means engineers and other product development and support personnel may be forced to spend even more time looking for information. Analytics by itself may not be enough to change this metric. Analytics within a sound PLM strategy can, however, keep users from floundering in all that data.

The Challenge of Unstructured Data

Before designing a new product or a vital component, every engineer searches for insights as to why previous similar products succeeded (or failed). Much of what these engineers plow through is unstructured, and this is the real analytics challenge. Types of unstructured data include scanned images, photographs, faxes, voice mails, and blog entries about companies and products, plus CAD files and even meshed geometry from finite element analysis (FEA).

Researchers at multiple IT analyst groups concur that unstructured information accumulates exponentially, two or three times faster than structured information, and changes constantly. Until recently, extracting useful information from unstructured data often required talking with the creators of the files.

Some specific examples of the unstructured information that engineers search for every day:

- The geometry of prior products, and of similar current products, usually from solid modelers.

- Field reports on failures, causes, fixes, and maintenance records.

- Anticipated service conditions and likely stresses and strains on components.

- Mechanical-properties data and test results that support FEA loads and boundary conditions, as well as manufacturing simulations.

- Inspection and measurement reports from production.

- Invoices, purchase orders and specifications from suppliers.

- Traceability reports for component materials and bills of materials (BOMs).

Engineers also need to search the specifics of relevant industry standards, widely agreed-upon best practices, and legally enforceable codes and regulations. These are unique to every industry: offshore oil and gas drilling equipment; cars and trucks; aircraft; spacecraft; machinery of all types; medical devices and electronics; pharmaceuticals and food and beverages; consumer packaged goods and apparel; and the myriad manufactured components that make up the nation’s infrastructure. Codes and regulations have detailed reporting mandates at local, state, national, and international levels.

Finding this kind of information requires patience plus skills in navigating organizational structures, digital networks, and databases. Uncovering hidden connections among seemingly unrelated items, the more important value proposition, requires powerful analytics tools. These tools are a focus of much development in areas such as ad-hoc report formats, key performance indicators (KPIs) to help keep searches relevant, and customizable dashboards.

The new dashboards generate intuitive, easily understood visuals for busy decision makers and help ensure that only current, verified information is being navigated. Dashboard innovations also help align business processes with real-time as well as archived data.
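
To make the idea of uncovering hidden connections concrete, here is a minimal sketch of the indexing principle behind such tools; it is ours, not any vendor’s product, and the documents and part number are hypothetical. Text from unrelated sources is tokenized, and terms shared across sources, such as a part number, point to relationships no single file reveals on its own.

```python
from collections import defaultdict
import re

# Hypothetical snippets from scattered, unrelated sources.
documents = {
    "field_report_0412.txt": "Seal failure on pump housing P/N 7741 after 900 hours.",
    "supplier_email.msg": "Revised gasket spec for 7741 attached; see updated drawing.",
    "cad_metadata.xml": "<part number='7741' rev='C'>pump housing</part>",
}

# Tiny inverted index: token -> set of source documents containing it.
index = defaultdict(set)
for name, text in documents.items():
    for token in re.findall(r"[a-z0-9]+", text.lower()):
        index[token].add(name)

# Tokens that appear in more than one source hint at connections
# a reader paging through folders would likely miss.
for token, sources in sorted(index.items()):
    if len(sources) > 1:
        print(token, "->", sorted(sources))
```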

Structured data, in contrast, is less voluminous (also according to IDC), somewhat slower growing and much easier to handle. Spreadsheets are a prime example of structured data; data in rows and columns is easily searched, retrieved, parsed, and reformatted. Also comparatively easy to search are text-based files and e-mail.
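
A short sketch shows why such row-and-column data is so tractable; the spreadsheet export, column names, and values below are hypothetical, and the same search could be written in any query tool.

```python
import csv
import io

# Hypothetical spreadsheet export: materials test results, one row per part.
raw = io.StringIO(
    "part_id,material,yield_strength_mpa,test_date\n"
    "A-100,aluminum 6061,276,2011-11-02\n"
    "A-101,steel 4140,655,2011-12-14\n"
    "A-102,aluminum 6061,270,2012-01-20\n"
)
rows = list(csv.DictReader(raw))

# Search and parse in one pass: aluminum parts with yield strength above 250 MPa.
hits = [r for r in rows
        if "aluminum" in r["material"] and float(r["yield_strength_mpa"]) > 250]

# Reformat: print only the columns of interest.
for r in hits:
    print(r["part_id"], r["yield_strength_mpa"], "MPa")
```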

Happily for those with PLM strategies in place, the analytics software developers compete vigorously on the breadth of search capabilities, scalability for growing enterprises, robustness to handle the data tsunami, “granularity” for really deep dives, ease of installation and use, and cost.

The Data Tsunami and Intellectual Property

Making sense of the data tsunami in PLM requires focusing on intellectual property (IP), that part of an organization’s cascade of information deemed essential to business survival. While “essential” varies from enterprise to enterprise and from industry to industry, commonly accepted definitions of what constitutes IP are expanding steadily.

Not long ago, IP was thought of legalistically: patents and trade secrets, certain financial data, and employees’ personal data. Proprietary information such as production data, test results, and field-service reports was added later. But even those IP add-ons are small when compared to:

- The U.S. Federal Rules of Civil Procedure, which mandate that in civil litigation opposing counsel must be granted access to electronically stored information such as e-mail, voice mail, “instant messages” (IMs), and the contents of “smart” phones and personal digital assistants (PDAs). Noncompliance can lead to onerous penalties. Judges can place vital enterprise data “off-limits” to its owners and, paradoxically, put that same data into the public domain during a trial.

- Directives aimed at regulating environmental sustainability, originally adopted in the European Union. These include Restriction of Hazardous Substances (RoHS); Waste Electrical and Electronic Equipment (WEEE); Joint (electronics) Industry Group standards on reporting usage of hazardous materials (JIG); Registration, Evaluation, Authorization and Restriction of Chemicals (REACH); End of Life Vehicle (ELV); the Global Automotive Declarable Substance List (GADSL); the U.S. Consumer Product Safety Improvement Act of 2008 (CPSIA), and others. Added to these are bans on “conflict minerals” mined in war zones, and rules on batteries and packaging. Noncompliance can be dealt with harshly, and publicly.

To summarize, engineers, managers, and others involved with PLM need to be concerned about finding the best ways to benefit from the data tsunami. Analytics needs to be embedded in PLM strategies as well as in business models. On the other hand, fear and its fellow travelers, uncertainty and doubt, offer no solutions whatsoever. They constitute no more of a strategy than dithering about the approach of a tsunami.

As analytics and PLM advance in tandem, their convergence is both inevitable and welcome. Analytics within PLM offers another solid solution to the unceasing pressure of global competition and the imperative to achieve more with less. That overworked phrase means creating new products sooner than the competition does, making products that are better than those of the competitors, and then building sustainable competitive advantages from successful products.

The empowerment of analytics can make this happen. It is becoming an “essential” of PLM.

PLM-Linked Analytics Capabilities

Examples from leading PLM solutions providers:

Siemens PLM Software. In the Teamcenter portfolio is a new product called Active Workspace (AWS). AWS is a multi-disciplinary visualization, collaboration, and decision-support platform to access entire PLM ecosystems. AWS has been designed to aggregate data from multiple enterprise systems, data repositories, and Teamcenter itself to provide business-intelligence solutions. Embedded reports offer the ability to quickly drill down (or up) to the proper levels of information decision makers need.

SAP. SAP’s emphasis is in-memory technology, which moves the data to be processed into the analyst’s system’s random access memory (RAM); traditionally, data was accessed from the individual disks where it was originally created or where it is archived. The first such product is SAP’s High-Performance Analytical Appliance (HANA). HANA is a flexible, data-source-agnostic “in-memory appliance” for analyzing high volumes of transactional data in real time. It includes tools for data modeling, data and lifecycle management, security, operations, and more. Proponents cite enormous performance gains in response time, obtaining useful data in minutes instead of hours.
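
The underlying idea can be illustrated generically, without HANA itself: the same aggregation runs against an on-disk database file and an in-memory copy. The sketch below uses SQLite purely to show the concept; it does not reproduce the columnar storage or scale that in-memory appliances rely on, and the table and figures are invented.

```python
import os
import sqlite3
import tempfile
import time

def build(conn):
    # Invented table of transactional rows.
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     ((i, i * 0.5) for i in range(200_000)))
    conn.commit()

def total(conn):
    return conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]

disk_path = os.path.join(tempfile.mkdtemp(), "orders_demo.db")
disk = sqlite3.connect(disk_path)   # data lives in a file on disk
mem = sqlite3.connect(":memory:")   # data lives entirely in RAM
build(disk)
build(mem)

for label, conn in (("on disk", disk), ("in memory", mem)):
    start = time.perf_counter()
    result = total(conn)
    print(f"{label}: sum={result:,.0f} in {time.perf_counter() - start:.4f}s")
```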

PTC. The focus at PTC, based on two success stories, is supply chain visibility to ensure regulatory compliance, including RoHS, REACH, JIG, and ELV. Motorola uses PTC’s Windchill Product Analytics to track and control restricted materials and substances in more than 50,000 wireless communications products and hundreds of thousands of supplier parts from thousands of suppliers in Motorola’s global supply chain. At APC by Schneider Electric, a maker of electric power distribution systems and controls, Windchill tracks compliance with RoHS, REACH, and JIG in 300,000 parts from more than 1,500 suppliers. Additionally, the system verifies compliance for 6,000 products with about 500 different national laws. APC by Schneider Electric says Windchill enabled it to quickly collect 25 times more supply chain data than its old manual methods, while reducing the overall cost of compliance by 80 percent.

Oracle–Agile. Oracle’s Agile PLM solutions enable companies to innovate profitably across their product value chains. Agile PLM helps organizations rapidly commercialize products through integrated portfolio and business planning; enterprise cost, quality, and compliance management; analytics; and clean product data publishing through seamless integration with other product value chain management solutions. Oracle cites a solid track record of references and time-to-value focused implementations, with more than 1,500 global companies relying on Agile PLM solutions to help them align their product value chains and achieve their innovation goals.

Dassault Systèmes. Exalead from Dassault Systèmes provides business-process-specific information access solutions called search-based applications. From interactive analytics dashboards or web search-style queries, users gain access to intelligently organized information that fosters better decision making in day-to-day operations and in executive-level strategies. Exalead applications use big-data-scale indexing, semantics-driven data processing, and customizable drag-and-drop user interfaces to manage up-to-date information. PLM, ERP, the web, social media, and data warehouses can all be indexed by Exalead’s products to provide broad decision-making contexts. Acquired by Dassault Systèmes in 2010, Exalead claims 300-plus user companies worldwide.

 
