Why Better PLM Analytics Equals Better Products

Full transcript below:

Doug McCormick:

Hello. Welcome to “Why Better PLM Equals Better Product” with Jeff Hojlo and Heatherly Bucher. This IEEE Spectrum Tech Insider webcast is brought to you by Arena. I’m your host, Doug McCormick.

Before we start, let’s mention a few housekeeping items. First, the most common question: Yes, this presentation will be archived. The recording should be posted about 24 hours after we close. You’ll be able to find this session and other Tech Insider presentations at www.spectrum.ieee.org/webinars. There’s also a Webinars option under Resources in the Spectrum nav bar. We will send all registrants an email when the archived webinar goes up so that you can revisit it or share it with colleagues.

Second, we do encourage questions. We’ll answer them all after the talk, but you can submit them at any time during the discussion. Enter them in the Q&A box on the left side of the conference window and then hit submit.

Third, a word about the interface: You can enlarge your slides by clicking on the green rectangle at the top right of the slide area. If you inadvertently minimize it, you can get it back again by going down to those icons at the bottom of the page and clicking on the yellow slide icon that’s the second from the left. If you’re listening over your computer speakers, you can adjust the volume in the media player area at the upper left of the screen. And remember, you may need to adjust your system’s master volume.

The icons at the bottom of the screen also include a green file folder second from the right. This links a resource page from which you can download PDF copies of today’s slides. And finally, this webinar can help you earn PDH and continuing education credits. At the end of the session, we’ll put up a code and URL that you’ll need to sign up.

And now let’s introduce today’s speakers. Jeff Hojlo is Product Innovation Strategies Program Director at IDC. He has 20 years of IT experience and has concentrated on product lifecycle management for the last 10. Before joining IDC, he was global marketing lead for Siemens PLM Software, and earlier he was a product innovation and PLM analyst at AMR Research. His earlier experience includes marketing roles at companies such as Engage, RSA Security, and Performaworks.

Heatherly Bucher is Senior Product Marketing Manager at Arena. Her 20 years of enterprise application experience began in the early toolkit days of Metaphase and PTC at Iomega Corporation and runs through Agile and now Arena. Her expertise includes building global teams to implement, service, and support customers on PLM systems and other enterprise applications. And with that, let’s turn the virtual podium over to Jeff Hojlo for why better PLM equals better products. Jeff?

Thank you, Doug. And hello everyone, wherever you are in the world. Welcome to my segment of today’s presentation, which I entitled “Digital Transformation and PLM Analytics,” fitting because we live in a time of business and technology change as well as innovation. And we think analytics plays a very important role in this. Just briefly, I’m the Program Director of Product Innovation Strategies at IDC, where I write and research product lifecycle management and related technologies. And I’ve been doing so for a number of years.

So, let’s get into my agenda here. Three points. One, I’m going to start off with market trends that are driving the analytics imperative as we see it here at IDC. What is some of the enabling technology, not just technology applications, but also the platforms and new approaches to developing products that we see enabling manufacturers today? And then we’ll wrap up with what we call IDC essential guidance if you are a customer of ours and you read our research. At the end of every research document, we have a section called essential guidance, which is basically the key takeaways to the research project. So, we’ll try to do that today.

So, let’s get into the market trends, and I’ll just build this out here. Today we’re dealing with a tremendous amount of change and complexity, and manufacturers need to know how to address areas such as: product complexity, as products and devices become smarter with increased mechatronic and software content; dynamic customer demands, as mass customization and configure-to-order become more prevalent across all industries; multi-dimensional supply chains, necessary to address the aforementioned product complexity and demand; and speed-to-market demands.

And concurrently, it’s a very competitive global marketplace, and companies need to create new customer experiences, because customers don’t just buy products; they buy the experiences they have with those products as well. They need to improve operational efficiency, that is, improve productivity; generate new revenue streams, which after all is why you digitally transform in the first place as an organization; and, finally, respond to changing conditions. That could be product demand, competition, market fluctuation, supply chain issues, or new or changing customer needs.

Those companies that succeed in the new digital economy, we think, will be organizations that can execute on their vision for digital transformation, of course; forge customer loyalty by blending the digital with the physical, that is, leverage 3rd Platform technologies, as we call them here at IDC, which include Cloud, mobile, analytics, and social business technologies, to enhance the experience; and leverage information for competitive advantage. And we talked about creating new revenue streams. One opportunity is around connected products and systems, which enable a flow of information about product performance and usage that can be leveraged to improve products, respond more quickly to issues, and innovate.

The underlying theme to all of these is speed of information and decision-making. Products, manufacturing processes, supply chains, and demand are all complex and connected, with a tremendous amount of information flowing between them. So companies need the ability to manage, analyze, and consume this information for business value, and therein lies the opportunity for product lifecycle analytics.

The following slides have a few quotes from business leaders you may recognize, highlighting the new reality of today’s digital age. The first one here is fairly prevalent out there; you’ve probably come across it before: if you went to bed last night as an industrial company, you’re going to wake up today as a software and analytics company. Companies succeed over the long term because they change, they transform, and GE is a great example of that, I think. They realized that the complexity and connectedness that exists is driving the need for analytics, so good decisions can be made and they can grow as a company.

Here’s Mark Fields, CEO of Ford, saying that the car is becoming the ultimate technology product and that Ford is becoming more of an information company. And that was really evident, I think, if you went to CES this year: the show actually had a pavilion set aside for automotive companies, as well as the high-tech Tier 0 and Tier 1 suppliers that are working very, very closely with them to innovate those cars. They really are becoming, in a way, high-tech devices on wheels.

And my final quote here is from Brian Krzanich of Intel. Intel really works at the heart of making products smarter. Think about the increased amount of software and mechatronic content that exists in those products; Intel is an enabler of that. At CES earlier this year, he confirmed that he, too, is seeing customers buy products based not on product features, but on the experience they have with the product. I’ve heard many CEOs and people in the market talk about people choosing experiences over products, but I thought it was really telling that the CEO of Intel is saying this as well.

So, let’s take a step back right now and define what IDC means by digital transformation. First, companies are transforming and extending so they have new sources of innovation and can meet customer demand for options and experiences. Secondly, companies are using information to make better decisions faster, resulting in better design, better collaboration, better manufacturing, and better-quality products. And finally, they’re using 3rd Platform technologies and what we call innovation accelerators at IDC, which include areas like the Internet of Things (IoT), robotics, and 3D printing. Manufacturers are connecting their organizations, products, and value chains to improve productivity, competitiveness, and societal benefit.

Let’s talk about how innovation is digitally transforming, and again, I’ll just build this slide out. So product and value chain complexity, and I touched on this a little in the first slide. But increasingly, manufacturers across the industry are participating in complex extended value chains. Dynamic change due to increasing product complexity, customer demand, and service efficiency across these value chains is typical. It means manufacturers need to take a systems engineering approach to product development while tying product tightly to demand and supply.

We refer to this extended view of PLM as the product innovation platform, and I’ll talk a little more about that new approach later in the presentation. Connected everything: connectivity is ubiquitous. Products are connected, value chains are connected, and customers are more connected to brand owners. The expectations of speed to market, with quality products that can be serviced quickly, are higher than ever, meaning manufacturers need a way to collaborate easily and make sound decisions quickly.

Making sense of data, which is kind of why we’re all here today, I guess. Manufacturers have a wealth of information across the enterprise that, if managed and analyzed in a unified fashion, can drive product and service innovation. They need to be able to trust this information so it can be used to accurately verify customer demand, determine the right mix of suppliers, and validate product and process quality. Which leads me to product quality. We conducted a product and service innovation survey last year, and the top area of focus around product and service is quality. Furthermore, when we asked what their cloud plans are, respondents said quality is a top process to roll out in the Cloud. The growth of complex, smarter products with an increasing amount of software within them not only increases the difficulty of building a product, but also of maintaining and servicing it.

Just a little bit more on innovation and how it’s digitally transforming: the convergence of information technology, operational technology, and communications. In their efforts to cultivate greater customer and asset intimacy, manufacturers are increasingly adding smart functionality to products. They’re adding IoT technologies such as telematics and sensor-based remote monitoring for asset health, diagnostics, and predictive services. And mobility and social product development, production, and service can be the bridge between products, assets, and engineering.

Sustainability and compliance. Compliance continues to be a driver, with Tier 4 emissions in heavy equipment, the eco-design directive for energy efficiency in industrial machinery, and regulations that have been around for years, like RoHS and REACH in the high-tech and process industries. And increasingly, manufacturers are paying attention to the energy efficiency of both their products and their manufacturing plants as good business. Our survey data shows reducing the amount of energy required to manufacture and operate a product as top areas of focus.

Increased competition in new markets: always, always a big issue with CEOs. Global competition remains one of the top things that keeps executives up at night, according to our research, and from this flows the natural question of how to get products to market faster to penetrate new markets that have unique demands and needs.

And finally, customers expect an increasing level of service. Customer experience is increasingly viewed by manufacturers as a necessary element of B2B transactions, one that is enabled by 3rd Platform technologies and, as I mentioned, innovation accelerators.

Let’s talk a little bit about the enabling technology that I mentioned. Here’s the model that we have published and been writing about, researching, and discussing with manufacturers: the next generation of PLM, called the product innovation platform. Look at this as a new approach to addressing the transformation and complexity that we’ve been talking about. So what exactly is a product innovation platform? It’s PLM extended to other enterprise systems, data sources, processes, and people for a unified view of all product and value chain information, before and after market. Ultimately, we think this approach leads to product development that is demand-oriented, data-driven, and digitally executed. And you can see that PLM is core to the model here. Given the number of processes and data sources that touch products, you can understand why we think analytics is a core part of this extended approach to PLM; you can see product analytics is part of those core capabilities.

It should be noted that analytics needs to be a key part of every PLM platform, no matter how extended it is. Small companies can benefit from an analytics tool that combines, for example, reporting and analytics on the development process, supplier performance, and product quality information. So there are different levels and different approaches to achieving an extended view of PLM enabled by analytics.

Let’s look at what our data shows regarding analytics and related technology areas. We have a group at IDC that focuses its research efforts on big data and analytics, and this is a slice of data from a survey they fielded that shows product and service improvement and innovation as the top reason companies invest in analytics, which I thought was obviously compelling for this presentation today. And we fielded a study last year on, as I mentioned, product and service innovation practices and plans that further highlights the need for analytics to support innovation.

I’ll show some results in the following few slides. Let’s look at this first one here. In this survey, we asked two questions in particular that are telling: What are the top drivers for your PLM efforts? And what are the top drivers for your SLM, or service lifecycle management, efforts? At IDC, we really look at product innovation and the aftermarket servicing of products as two areas that need to be connected. For PLM, the clear winner is product quality. Not surprising when you consider the innovation trends previously discussed, such as product complexity, global competition, and the fact that products are now more connected than ever.

Let’s go to the really interesting slide here, I think, on analytics. We also asked what processes your PLM and SLM analytics tools do or will support, that is, analytics capability either outside of a PLM system or embedded within a PLM system; we didn’t differentiate. From a product standpoint, clearly, manufacturers think they need analytics to track, predict, and improve product performance, quality, and service.

Also, manufacturing execution was a prevalent response. I thought that was interesting. Why manufacturing execution? I think it really derives from products. Because products are complex, manufacturing processes are hence complex, and time to market and demand pressures are great. Tracking and tracing, therefore, how quickly and efficiently a product is made is critical.

And then let’s talk a little bit about the Cloud, in the last slide I’ll show here. Here we asked what elements of PLM and SLM you deploy or plan to deploy in the Cloud, and supplier collaboration was, not surprisingly, one of the high areas; I think this has been so for cloud-based PLM for years. Secondly, design or formulation review as one of the highest responses here is, I think, an indication of an increased comfort level with the Cloud among manufacturers, from both a performance and a security standpoint.

And you see, again, quality management. Why quality in the Cloud? We think this is due to the computing power available in the Cloud today and, simply put, the need for speed when it comes to addressing quality issues.

And just one final point on this: there’s a lot going on in the aftermarket, as you can see, as manufacturers look to manage service planning, field service management, and customer support in a cloud environment, which really highlights the overall need people are looking to fill with cloud-deployed systems.

So, I’ll get a little bit into the value of PLM analytics. What are some of the specific use cases for PLM analytics as we see them? I’m sure Heatherly will speak more from the Arena perspective on this, but let’s talk about a few of them. Product compliance: monitoring and restricting the materials used in products. Product portfolio management: manufacturers with increasingly varied portfolios of products need the ability to optimize variants, product options, and mix. Think of that as part of the analytics process that you need to employ during new product development and introduction.

Product costing: the ability to model cost during the front end of innovation based on material, supplier, manufacturing process, or third-party logistics is an important capability any PLM analytics tool should provide. Product quality: we’ve talked about that a number of times. This is the top reason manufacturers use PLM, so your analytics tool had better do a good job of managing it to enable faster service to customers: determine the root cause of quality issues and use that information to improve and innovate products over time.

I think quality information is used by a lot of manufacturers today in a reactive fashion, to deal with failure modes, software issues, and updates as they come up. The opportunity is to use that historical information to improve products over time and even innovate new features or maybe brand-new products. And then finally, improved processes. PLM systems have for years done a good job of reporting on the development process and resource performance. As the product development value chain expands and engineering works more closely with manufacturing, close management of design and engineering development, manufacturing planning, and manufacturing execution and efficiency rise in importance.
 
Let’s get to the final section of my part of the presentation: essential guidance. To wrap up, I’d like to leave you with five key takeaways as you consider applying product lifecycle analytics to your business, and then I’ll turn it over to Heatherly of Arena. First, digital transformation and connected products are transforming innovation overall. Manufacturers, customers, suppliers, and partners are simply better able to collaborate quickly to meet rapidly changing demand and market conditions.
 
Secondly, analytics can be a bridge connecting your enterprise and value chain, empowering what I like to call a vibrant product innovation platform that really connects all the pieces together, as depicted in the model I showed. We see analytics as the glue that holds your matrixed enterprise and value chain together during product design, development, and introduction. Third, ensuring product and process quality is a key reason why analytics are needed in PLM, though manufacturers of all sizes can benefit from PLM analytics that track the performance of processes, by which I mean processes related to design, engineering, sourcing, supply chain, manufacturing, and service, as well as products. That could be failure information, customer complaint information, or software updates required.

Fourth, product analytics are cross-functional. The information needs to be easily accessible by role in the user’s environment; it should be embedded in the user’s system so they have the information they need when they need it. And then finally, consider a cloud-based analytics tool to expedite information sharing and lifecycle optimization across your value chain. As I showed earlier, our survey data shows that manufacturers are considering a cloud environment to view service, product performance, and quality information so they can expedite action and improve customer experiences.

And so, I thank you very much for your attention and the slides will be available as Doug mentioned. At this point, I’ll turn the presentation over to Heatherly from Arena. Heatherly?

Heatherly Bucher:

Great, thank you very much, Jeff, for setting the table on the trends we see in digital transformation in the product area and how important analytics is in that product innovation platform, or extended PLM. My part of the session today I’ve titled “Achieving the Promises of Analytics.” At Arena, I’m in the product marketing area, and I spend a good portion of my time talking to our customers, our prospects, our implementation consultants, and our product managers, helping translate our product, its features, and its capabilities into information related to the business processes our customers deal with every day.

And so, in my part of the presentation today, I want to look at some of the risks in data analysis. It’s a hot topic; everybody talks about it today, and everybody wants to achieve it. But as we’ve worked with prospects and customers, what we’ve found is that there are some risks involved, and if you know about them, you can avoid them by selecting good tools, by ensuring your datasets are sound, and by your approach to data analysis.

We’re also going to look at three areas that, again, if you’re aware of them, will help you be more successful with your data analysis. I’m going to do something a little different today: we’re going to look at those risks and levers not from a product view, but from data analysis in other industries, and then we’re going to bring it back to Arena’s view of product analysis, some examples of Arena analytics, and a brief comment about how we see product analytics progressing in the future.

Thirty years ago, Neil Postman, in “Amusing Ourselves to Death,” wrote about the dangers of the technology of the time, which was TV and radio, and he contrasted the visions of George Orwell and Aldous Huxley, two writers you may have heard of. Orwell feared a captive culture, whereas Huxley feared a trivial culture. Orwell worried about truth being concealed, books being banned, and dangers in gathering information, security, and safety. Huxley instead worried about drowning in a sea of irrelevance, with books unused and everyone passive, so much information that we wouldn’t know what to do with it. In summary: either Big Brother is watching (Orwell’s view) or we’re watching Big Brother (Huxley’s view).

In reality, we can make today and tomorrow better than both of these visions. What’s exciting about data, big or small, and the technology surrounding us today is that we have the opportunity to share meaning and change our behaviors, both at work in our companies and in our personal lives. But there are challenges and risks. So we’re going to look at five risks of data analysis that need to be considered for any analysis, product quality or otherwise. But before we do, we want to give you an opportunity to participate in the first of two polls today, and an opportunity to wake up a bit.

So, you should have on your screen a poll. What data analysis projects do you have in your company today? You can select all that apply in the next minute. Product development processes, change management cycle time, root cause analysis, supply chain health, manufacturing processes, customer-focused, other, or none that you know of. You may need to scroll if you need to see all those options that I read off. And we’ll give you a minute or two to click the button, take the survey, before we reveal the results.

Okay, it looks like we have a fair number of you with results, so let’s take a look. Product development processes and change management cycle time are high on your data analysis lists, manufacturing processes are high, and there’s some root cause analysis, some customer-focused work, and some supply chain. If we took this survey six or 12 months from now, it might look different: a Gartner survey from last year found that more than three-quarters of companies are investing in big data or data analysis in the next two years. So if your company’s not doing it today, it may be doing it very soon.

Let’s talk a little bit about data analysis and risk. In 2001, Doug Laney, then at META Group (later acquired by Gartner), coined the iron triangle of volume, velocity, and variety with regard to data. And at the time, and 2001 is not that long ago, it was difficult to harness all three aspects: difficult to get a lot of data, lots of kinds of data that connect, to get the data quickly, and to do all of this cheaply. Today, with advances in algorithms, software, and processing power, and with increased dataset size, complexity, and connectivity, we can get all three relatively cheaply. However, that doesn’t mean data analysis is easy or quick; it can still be a challenge. Done well, it can absolutely change behaviors and business outcomes. Done poorly, it can annoy, distract, or lead to bad business choices.

So, let’s take a look at five risks of data analysis. In 2009, a team at Google announced a remarkable achievement: without needing the results of any patient visits around the U.S., they could track the spread of the flu, and do it more quickly than the Centers for Disease Control and Prevention, with only a day’s delay compared to the week or more it took the CDC to create a picture of flu spread from doctors’ records. You may have heard of it: Google Flu Trends.

It was quick, accurate, and cheap data analysis. However, it was also theory-free. The Google research team didn’t bother to develop a hypothesis about the search terms. For example, did searches like “flu symptoms” or “pharmacies near me” actually correlate with the spread of the disease? So, four years after the launch, Google Flu Trends died, because its fast model pointed to a severe outbreak, off by almost a factor of two compared to the CDC’s slow model, which did not show a huge flu outbreak that year.

What was the problem? Google could not begin to know what linked the search terms with the spread of the flu. The engineers were finding statistical patterns in the data; they cared about correlation, not causation. Sometimes, business-intelligence and data analysis proponents tell us that if we collect enough data, big data, then sampling techniques are not needed and cause-and-effect relationships are unnecessary: with enough data, the numbers will just speak for themselves. In the product design world, however, think of change process approval as an example. We often hear that visibility into change process bottlenecks will solve some time-to-market challenges, which is true. But a little later in the presentation, we’re going to look at the change approval analysis process and see what happens if we have correlation without causation.
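The pitfall is easy to demonstrate with a small sketch (invented numbers, not data from the talk): two series that merely share an upward trend, say seasonal search volume and flu cases both rising through winter, will correlate strongly even if neither causes the other.

```python
# Illustrative sketch: a strong Pearson correlation between two trending
# series proves nothing about causation. All data below is made up.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# Hypothetical monthly search counts and flu case counts, both rising
# through winter for possibly unrelated reasons (e.g. media coverage).
searches = [100, 140, 170, 230, 260, 320]
cases    = [ 20,  25,  33,  41,  55,  60]

r = pearson(searches, cases)
print(round(r, 2))  # strong correlation, yet no evidence of causation
```

A change-approval analysis can fall into the same trap: long approval times may correlate with late launches without being the bottleneck that causes them.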

So, the first risk of data analysis is correlation without causation. Let’s take a look at the second and third. In 1936, Republican candidate Alfred Landon stood for election against President Franklin Delano Roosevelt. The Literary Digest took on the job of forecasting the result and conducted a postal opinion poll. With amazing ambition, it aimed to poll 10 million people, a quarter of the electorate. It fell short, but it still tabulated a staggering 2.4 million returns and announced that Landon would win. As for the actual election results: though most of us probably did not know Alfred Landon’s name before today, we know that Roosevelt won, crushing Landon. To add insult to injury, a much smaller poll by a new opinion-poll pioneer, George Gallup, predicted victory for Roosevelt. So what did Gallup get right that the Literary Digest got wrong? Gallup knew that when it comes to data, size is not everything. Data has other values in addition to size: the validity of the data, the type of data, the method of collection, the speed of collection, and the data gaps.

In addition, the Literary Digest forgot about data bias. In its quest for bigger data, it mailed its poll to a list taken from automobile registrations and telephone directories, which in 1936 meant disproportionately prosperous people. We might think we wouldn’t make the same mistakes today. Unfortunately, we can and we do.
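A minimal simulation (hypothetical numbers, not the Digest’s actual data) shows how a biased sampling frame can flip a prediction even when the tally itself is computed correctly:

```python
# Illustrative sketch of sampling-frame bias. Made-up electorate of 100
# voters: 60 favor candidate "A", 40 favor "B", but B's supporters are
# heavily over-represented among the people the poll can reach
# (here: phone/car owners).
population = (
    [("A", False)] * 55 + [("A", True)] * 5 +   # (vote, reachable)
    [("B", False)] * 10 + [("B", True)] * 30
)

def poll_winner(voters, frame=lambda voter: True):
    """Tally only the voters inside the sampling frame; return the leader."""
    tally = {}
    for vote, reachable in voters:
        if frame((vote, reachable)):
            tally[vote] = tally.get(vote, 0) + 1
    return max(tally, key=tally.get)

print(poll_winner(population))                  # whole population: "A"
print(poll_winner(population, lambda v: v[1]))  # reachable only: "B"
```

The biased frame predicts the wrong winner, which is the Literary Digest’s failure in miniature: more returns from a skewed list would only have made the wrong answer more confident.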

So, let’s look at a current example that introduces us to our last two data analysis risks. Consider Twitter and Facebook and the many companies and institutions looking at this fire hose of data; big companies are turning to these datasets. Thomson Reuters now incorporates sentiment analysis gained from Twitter into its market analysis and trading platform. We’re in another election year, like 1936, and we can look at many polls, including, now, analysis of Facebook “likes” to predict election results. This particular screenshot you see was compiled by the very interesting group at 538.com. But are Twitter or Facebook users representative of the population as a whole, or is this another case of sample bias?

A Pew Research survey in 2013 found that U.S.-based Twitter users are disproportionately young, urban or suburban, and African American. So what are some problems with this particular analysis? Again, we have sample bias. But the fourth risk of data analysis is lack of context. Data is interpreted through context: the people, processes, actions, and even other datasets surrounding the data in question are necessary to interpret it. As just mentioned, the ever-expanding extended PLM, or product innovation platform, captures more and more of the data and processes needed to provide that context. In this case, Facebook “likes”: what do they mean? Were 10%, 20%, 50% of the Trump Facebook postings actually negative postings about him? If so, what do the “likes” mean?

Finally, the last data analysis risk is not connecting the analysis itself to value. What is the value of the resulting analysis? For businesses, we have to think about what questions, if we knew the answers, would impact our product quality, performance, or time to market. Why would they do so? How would we use the answers? So we have five data risks. Now let’s look at some data analysis done well to see three data levers that will help us be more successful.

In 2007, Anne Milgram became the Attorney General of New Jersey. When she did, she began by asking some basic questions: Who were they arresting? Who were they charging? Who were they putting in the nation’s jails and prisons? And finally, she wanted to understand whether they were making decisions in a way that made everyone in New Jersey safer. But she couldn’t get the information. Why? Because she found that most big criminal justice agencies didn’t track the things that mattered. It took her team a lot of manual work to get the data initially, and it turned out that what they were doing wasn’t very good. When she investigated, she saw well-meaning detectives and officers trying to fight crime with Post-it notes on whiteboards, with no data-driven effort.

So, she started thinking about how we make decisions. What she found, and what might resonate with you, is that much of the time we make decisions based on instinct and experience, and maybe Post-it notes. So what Anne Milgram decided was that New Jersey would “Moneyball” justice: apply the Oakland A’s data-driven approach to picking players that would help them win games, a method that became famous and then went on to be a movie. It worked for the Oakland A’s, and it worked for the state of New Jersey.

They reduced all crime by 26%, murders by 41%. And they went from focusing on low-level crime to cases of statewide importance. How did they do it? They focused on data and analysis. And by doing so, the team determined the questions to ask of the data. They originally brought together 900 risk factors, a crazy number, but they narrowed it to nine that mattered—nine factors that predicted the things judges wanted to know. Will someone commit a new crime if released? Will the person commit an act of violence if released? And will the person be back in court if released?

So if we look at the New Jersey example, we can use datasets to drive to insight that lets us solve problems, whether those problems are social, consumer marketing, or product-related. But to do so requires discipline and time, not just sets of data.

Second, we need to know what we really need to know. It’s an impossible acronym to pronounce, so I don’t think it will catch on anytime soon. But I’ve found, from my work with customers over the years implementing PLM, that focusing customers on this as a guiding principle is critical to success when enabling business processes, helping companies figure out what’s going on with their product-related processes and the quality of their products. Have a purpose for the questions, like the New Jersey justice system or the Oakland A’s.

Finally, you need to be reasonable. Last year, Gartner in a survey identified four best practices that BI (business intelligence) and analytics leaders need to get analytics initiatives going. In that report, Gartner suggests that day-to-day operations and tactical decisions are a good place to start, because where there’s a lot of data, complexity, and people, there’s opportunity. Gartner’s recommendation was to buy packaged applications, because many companies do not have the necessary skill sets and tools and may decide never to build to that level internally. Data analysis and business intelligence are becoming the new normal, and that’s a good thing. It means that businesses can begin to approach them as a standard practice, but it doesn’t mean that it’s easy.

Before I start talking about Arena’s perspective on data analysis, we have our second survey, the second of two, on your screen, and you can select all that apply. Again, you may need to scroll down if you don’t see all the options. I will read them out for you. What quality initiatives is your company pursuing? Continuous improvement, design for manufacturing, increased functional testing, supplier corrective action, CAPA or similar, meeting specifications and requirements, other, or no quality initiatives at this time? I’ll give you a few minutes to find your mouse and click the boxes.

And let’s take a look at the results. So, among the quality initiatives our audience is pursuing, we have a lot on continuous improvement and meeting specifications and requirements; a healthy number for design for manufacturing, increased functional testing, and supplier corrective action; some CAPA, so we may have some medical device companies in our audience; some other; and only a few have no quality initiatives going on right now.

I like this quote from David Garvin, Harvard Business School. If you haven’t read any of his writings, he has some very fascinating writings on quality. And one of the things that he continues to state is that in order for you to manage quality, and I would say in order for you to analyze quality, you first have to understand it.

So before we talk about data analytics in the products world, we need to define the products world, or the product record. As Jeff discussed in his slides, the product record has expanded during this digital transformation, as have the technologies we can actually use to do business. For many of us who’ve been around for a while, the product record was originally defined as a bill of materials [BOM], or maybe the bill of materials and its associated physical files, approved manufacturer relationships, and engineering change orders.

However, to design, produce, and improve innovative products, we have to think of the product record differently, or more broadly. The product record today includes not just the BOM and its associated files, but also requirements management; quality and CAPA; product projects; software, which, in the world of the Internet of Things, more and more products have; component management and compliance, as Jeff discussed; hardware defects; team training on product areas; tooling methods; and more. And the product record is in a never-ending cycle, as we know, from concept through release to sustaining.

For the purpose of product analytics, that dataset has just grown considerably when we look at the product record in this way, and then we need to consider the context, the people, and processes that touch the product record every day. If we can have a single, unified source of truth of the product record, we reduce user errors, miscommunications, and lack of visibility. Then we’re setting the stage for better product analytics efforts to increase the strategic value by reducing those quality issues, accelerating NPD, which is ultimately about time to market, and improving profitability.

So, what is Arena’s view of product analytics? Or, from the flip side, what product and process performance analysis should you demand of your PLM solution, whether you have one or are shopping for one? First, we believe the dataset itself needs to be there. The data needs to be structured, you need to avoid data gaps, and the dataset should include the complete product record, growing over time as people participate in product processes. If you can do that, you avoid the risks we talked about with regard to data values, bias in data, and context, and you can leverage that: know what you need to know, with a set of questions that you define.

Second, as we saw in looking at the successes and failures of data analysis projects, Arena believes data analysis is not simply reporting. It should be actionable and add value. This requires that you have a thoughtful plan for analyzing product processes, and/or a tool that has thought this through for you.

Third, Arena appreciates the difficulties involved in doing data analysis well, as well as the need to gain value as quickly as possible without having an army of software programmers or data analysis experts on staff. So we’ve partnered with GoodData, a leader in data analysis engines, to build our product analytics. Coupled with our product record dataset, this analytics engine ensures our customers begin with a solid foundation. Additionally, our approach allows our customers to connect their complete product record in Arena with additional external structured datasets for more strategic business analysis. Finally, Arena wants to Moneyball product processes and quality concerns. Our Arena Analytics product begins with the questions most valuable to the business.

So, let’s finish by looking at some common business scenarios you may have encountered, and how product and quality analytics done well could provide insights for decisions today as well as digging a little deeper for predictive or future decision-making. Remember, I talked a little bit earlier about change approval bottlenecks as being a common question. We often get asked about change approval processes. When we talk to customers, they want to know where are the bottlenecks and who are the bottlenecks.

So the Arena Analytics dashboards can create those metrics and analyses for us, and we can look at where the bottlenecks are in our change approval processes. In this case, let’s say we look and see that two of our team members, Charles and Frank, are bottlenecks. It takes them two or three times longer to take action on changes than anyone else on the team. We might decide to reassign, reprimand, or remove these two slow change approvers from the equation. However, if we do that, it could be that product quality starts to decline, with more nonconformances, rework, and scrap than we had before. Why? We might have forgotten about correlation without causation.

Remember Google’s example: we might need to go a little deeper in our analysis. Perhaps Charles and Frank were the only people actually reviewing changes the way everyone on the team should have. Quick analysis sometimes gives us interesting information, but with better analysis and context we might consider, for example, not just the actions our team is taking on changes but other behaviors, such as reviewing and opening the files associated with changes. And then we might find that Charles and Frank are actually our best change approval team members, and that we need to do some additional training with the other team members.
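The kind of deeper analysis described here can be sketched in a few lines. This is a minimal, hypothetical illustration, not Arena’s actual data model: the approver names, hours, and the “files opened” signal are all invented to show how pairing raw cycle time with a review-depth measure changes the picture.

```python
# Hypothetical change-approval records: (approver, hours_to_approve, files_opened).
records = [
    ("Charles", 72, 9), ("Frank", 60, 8),
    ("Alice", 20, 0), ("Bob", 24, 1), ("Dana", 18, 0),
]

def summarize(records):
    """Average approval time and review depth per approver."""
    totals = {}
    for name, hours, files in records:
        hrs, fls, n = totals.get(name, (0, 0, 0))
        totals[name] = (hrs + hours, fls + files, n + 1)
    return {name: (hrs / n, fls / n) for name, (hrs, fls, n) in totals.items()}

for name, (avg_hours, avg_files) in sorted(summarize(records).items()):
    # Slow approvers who open many files may be the only real reviewers.
    note = "thorough reviewer?" if avg_files >= 5 else "rubber-stamping?"
    print(f"{name}: {avg_hours:.0f} h avg, {avg_files:.1f} files opened ({note})")
```

Ranked by cycle time alone, Charles and Frank look worst; adding the second dimension suggests the opposite conclusion, which is the correlation-versus-causation point in miniature.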

Let’s consider a quality analysis need. With Arena, teams can manage quality processes, closed-loop corrective actions, within Arena, connected to the bill of materials. If you are a quality team member or a quality director, one of your common questions might be: what quality issues need attention today? Increased visibility lets you see trends as they occur in Arena Analytics. In this case, we may come in, look at an analytics dashboard, and see a spike that has occurred over the last few weeks on a new product line. We can see that the spike has occurred and that we have some immediate-containment, severity-three issues. With an Analytics dashboard, we can drill down and see the quality processes we should be concerned about today. We can schedule that meeting, or we can head off the issues. In addition, after that meeting, we may want to come back and look at a broader context. Perhaps we can run some additional data analytics, considering quality issues for these particular product lines. We can look at trends across the past quarter, and we could consider some predictive analysis, or maybe we have a related upcoming NPD project that’s getting started. So analysis can answer questions for today, but we also need to push our product and quality analytics to look at questions for the future.
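A spike like the one described can be surfaced with a very simple rule. This is an illustrative sketch only: the weekly counts, the four-week window, and the 2x-baseline threshold are assumptions for the example, not how Arena Analytics actually computes its dashboards.

```python
def flag_spikes(weekly_counts, window=4, factor=2.0):
    """Return indices of weeks whose count exceeds `factor` times
    the mean of the preceding `window` weeks."""
    spikes = []
    for i in range(window, len(weekly_counts)):
        baseline = sum(weekly_counts[i - window:i]) / window
        if baseline > 0 and weekly_counts[i] > factor * baseline:
            spikes.append(i)
    return spikes

# Weekly nonconformance counts for a new product line (illustrative).
counts = [3, 4, 2, 3, 3, 9, 11]
print(flag_spikes(counts))  # -> [5, 6]: the last two weeks stand out
```

A flagged week is the cue to drill down into the underlying quality processes, exactly the “what needs attention today” question above.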

Let’s look at another area: compliance. If you’re subject to any type of compliance requirements, then you need to track compliance details, which you can do in Arena PLM. But you also need to be able to see easily where you are compliant at any time, for product lines, types of items, assemblies, and more. And you need to see this not just one time but all the time, sometimes daily, for new products or when you have audits approaching. So in this scenario, maybe we have an internal compliance audit for our largest product line, and our first question is: are we ready?

If we are tracking our compliance in that extended PLM, then we can drill down into our compliance metrics from Arena Analytics and find the particular items where we are not compliant. In this scenario, we found a noncompliant item, which gives us information to take to that audit prep meeting. But again, after the meeting, we may want to do some additional data analysis. When we have more time, we may want to consider: Is there a bigger problem brewing? For example, which supplier provided this noncompliant part? When did the part go noncompliant? What is the status of the other parts provided by that supplier?
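The follow-up questions here, which supplier, and how many of that supplier’s parts are affected, amount to a filter plus a group-by. A minimal sketch with invented item and supplier names (nothing here reflects Arena’s real schema):

```python
from collections import Counter

# Hypothetical compliance records: (item, supplier, compliant, date_flagged).
items = [
    ("PCB-100", "Acme",  True,  None),
    ("CAP-220", "Volta", False, "2016-03-01"),
    ("RES-330", "Volta", True,  None),
    ("CONN-4",  "Volta", False, "2016-03-08"),
]

# Today's question: which items are noncompliant?
noncompliant = [(item, supplier, date) for item, supplier, ok, date in items if not ok]
print(noncompliant)

# The bigger question: do the noncompliant parts cluster on one supplier?
by_supplier = Counter(supplier for _, supplier, _ in noncompliant)
print(by_supplier)  # both noncompliant parts trace back to "Volta"
```

If one supplier accounts for most of the flags, that points the deeper investigation, and perhaps a supplier corrective action, in the right direction.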

Finally, we’re going to consider one last product quality analysis example. As you may be aware, 8D is a meticulous process used to solve complex problems, originally pioneered by Ford Motor Company. Many of our customers follow 8D, and they track these processes in Arena, which allows them not only to enable the process but to capture all the data around these efforts in a way that lets them ask questions of that data, both for today and for future preventive actions.

So, if we’re tracking 8D processes in Arena, and let’s say in this case we have multiple sites, we might want to look at which of our sites is working on a lot of 8D processes and how those 8D processes are progressing. As a quick answer, we might see that site D has a lot of 8D processes open, almost twice as many as any other site, and that almost half of our 8D processes are in the very first step of 8D, which is defining the problem.
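A tally like this, open 8D processes by site and by step, is straightforward once the process data is structured. The site names, step labels, and counts below are invented for illustration; they assume a simplified record of (site, current step) per open 8D:

```python
from collections import Counter

# Hypothetical open 8D records: (site, current_step); step "D1" = define the problem.
open_8ds = [
    ("Site A", "D3"), ("Site B", "D1"), ("Site B", "D5"),
    ("Site C", "D2"), ("Site D", "D1"), ("Site D", "D1"),
    ("Site D", "D1"), ("Site D", "D4"),
]

by_site = Counter(site for site, _ in open_8ds)
by_step = Counter(step for _, step in open_8ds)

print(by_site.most_common(1))  # -> [('Site D', 4)]: twice any other site
print(f"{by_step['D1'] / len(open_8ds):.0%} of open 8Ds are still at D1")
```

The tally alone doesn’t say whether site D has a problem; as discussed next, context such as training dates is what turns the count into a conclusion.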

Is there a quick solution? What are we going to do with this information? Maybe nothing right now. We might need to consider context and whether there’s bias. How long have we been doing 8D, and at each site? When did we train all the sites? What differences exist at these sites? If we are tracking our training records and our SOPs in Arena, as some of our customers do, then we could continue our analysis and look at the training records for each site against the SOPs. And we may find that at site D, maybe we just hired a whole new group, a whole new team, and they just got training. That would explain both the spike in processes and so many of them being in that first step.

What I’d like to do, in summary, is share with you three thoughts Arena has about the future of product and quality data analytics. The first is that big and small data analysis is now the new business norm. As you heard both Jeff and me say, data analysis is going to be the way to move the business forward on product quality, time to market, and cost reduction. It’s going to be more predictive, it’s going to be more connected as that extended product innovation platform expands, and it’s going to be more valuable, but only if it’s actionable, only if the data analysis is done well, thoughtfully, and with questions that will impact your business. Second, for better product quality, analysis must begin with understanding what quality is. And the analysis has to be reasonable in tools, resources, and scope. And finally, as you heard both Jeff and me discuss, traditional PLM solution borders are expanding, and they must continue to expand, so that analysis has the benefit of this complete product record and process data. At this time, I’d like to turn it back over to Doug, who is going to moderate some questions that you have typed in. Thank you for your time.

Doug, I believe you have some questions from the audience that we can answer.

Jeff Hojlo:

Why don’t we jump right in here, Heatherly? I see a couple of quick questions that I can address. The first, again: yes, the slides will be available. And the second, somebody asked about the analytics engine. It’s called GoodData, correct, Heatherly?

Heatherly Bucher:

That’s correct. We built Arena Analytics, which has been embedded into our product as a module, but the engine underneath comes through our partnership with GoodData.

Doug McCormick:

Thank you. I apologize for the sound problem. Jeff, you mentioned that a product innovation platform approach to PLM looks good. But how realistic is that for most companies?

Jeff Hojlo:

I think it’s attainable and necessary for manufacturers of all sizes. And there may be different approaches per company size and per industry. All of that is because of the multidimensional complexity we talked about on the product front and on the demand front, where customer demand is constantly changing. Customers have different preferences that evolve over time. Supply chains are very complex and extended in a lot of cases. Tier 0 and Tier 1 suppliers are becoming real innovation partners for manufacturers; think of the auto industry and the high-tech industry, where auto is working more and more closely with high-tech partners to enable many of the new features that exist in cars today, the infotainment systems, et cetera.

So, all of that has brought us to this point, combined with the fact that we have the technology advancement in place in 3rd Platform technologies: cloud infrastructure, analytics solutions, mobile devices, which are obviously prevalent, and also social business-related technologies. I think you see a lot of PLM systems integrating into their UI a kind of social media way of communicating, where users can attach files and images and communicate with their supply chain or internal people.

And so, to accomplish this value chain collaboration, you really need a broader approach to PLM. PLM as an engineering workgroup tool, or PDM with a few CAD tools tied in, is just not going to cut it anymore. One final point: I mentioned industry, and I think that in different industries the starting point for a product innovation platform might be different. You have PLM at the core, but in pharma and medical device, quality management is probably the lead dataset and the lead set of processes driving that PLM approach. In automotive, aerospace and defense, and machinery, systems engineering and tying in with the service lifecycle are really important; in high tech and semiconductor, quality management is really important.

As you go down the list and get into more consumer-oriented industries, product portfolio management is probably very important, so that’s your lead approach. So, do you need to achieve the entire spectrum of the model that I showed up there to have a product innovation platform? Not necessarily; there are just different approaches and different flavors, if you will, of each.

Doug McCormick:

Thank you. Heatherly, I’m not sure whether you answered this while my sound had dropped out. But you did mention training records in the 8D example. Does Arena track training records, or would this be a connected dataset?

Heatherly Bucher:

Okay, that’s a great question. Actually, before I answer that question, I do want to apologize. Someone else pointed out to me, the author of “Amusing Ourselves to Death,” the book I talked about in one of the first slides, is actually Neil Postman, not Nick Postman, I apologize. And it is a great book. You can pick it up on Amazon Kindle or something like that—30 years old, but you would be amazed at how relevant it is for today.

With regard to training records, many companies, of course, need to track training records, and medical device companies, in particular, are required to do so. Our medical device customers, as well as companies in other industries, have been using Arena for some time to track training records and connect them to SOPs, policies, and other required training materials. As a result, we’ve been spending a lot of time talking to our customers about what they’re doing in this area, what else they would like to do with training record management, and why. We felt we could give our customers even more business process support, expanding those PLM boundaries, if you will.

So this year, in addition to what you can already do with training records in Arena, you will see a new module called Arena Training, our specific training record management module. It will allow customers not only to do what they have been doing with the product but will also provide more process support, as well as analysis and ultimately more value. So, we’re very excited to build on what our customers have been doing and expand that further.

Doug McCormick:

Thank you. And this question is for both of you, and I’m afraid it will have to be our last for today. At what stage do customers normally tackle product analytics?

Jeff Hojlo:

I’ll jump in there first. This is Jeff. I think it’s when there’s a high degree of complexity across products, supply chain, and demand; there’s this constant influx of information that doesn’t necessarily make it back to design and the broader product development team. How do you react to that information being generated? And I haven’t even started talking about connected products, which promise to have software within them to track not only failure modes and quality issues but also performance and usage over time.

Well, those products are now producing a lot of information that can be leveraged by manufacturers. So manufacturers that are involved in producing connected products, which is really most manufacturers across industries today, would probably be interested in an analytics tool. And related to that, complex products mean complex manufacturing processes, and you hear a lot about smart manufacturing, or Industry 4.0 as it’s called in Europe; it’s really the same thing. It’s the ability to change quickly and be flexible in your manufacturing processes to address changing demand.

And so you need to be able to track that those processes are actually meeting the as-designed requirements. Maybe one other point: we talked about this extended view of PLM and how there’s really a broader scope of people involved in product lifecycle management who need access to information. We’re talking about non-engineers, people who don’t necessarily design, engineer, and develop the products but who need certain points of information to help them do their jobs and launch products to market successfully. It could be sales and marketing, or executives who just want to track a process and see how products are performing and selling. Anything, Heatherly, from your end?

Heatherly Bucher:

You’ve covered it really well. From Arena’s perspective, we’re in complete agreement: the complexity of products today, connected products, the supply chain, and the pressure to get multiple products to market quickly while controlling costs all mean that you can and should start tackling product analytics very early in your lifecycle. You no longer need to wait until you have fully launched products and stabilized as a company before investing in product analytics. The nice thing is that if you have the datasets there from the beginning, along with the platforms and tools available for analysis, coupled with knowing what risks to avoid and what questions to ask of your data, you should be able to jump into product analytics almost as soon as you are collecting data. You can figure out what’s valuable to you, so that you base decisions not just on gut, instinct, experience, or one part of the product team, but on everybody and on the data you’ve been collecting as well.

Doug McCormick:

Thank you. We are out of time. I’d like to thank both of you for a particularly interesting session. I would like to point out that we have a lot of questions left in the queue, and that Heatherly and Jeff will be able to follow up with you afterward; the most recent question asks how to get in touch. If you’ll just enter that you’d like to be in touch with someone, that question will be in the queue and folks can follow up. And as we said earlier, the session will be archived in 24 hours at www.spectrum.ieee.org/webinars. All registrants will get an email reminder when it goes up. And please do note that you can complete your PDH continuing education registration at the URL here; it’s much easier than going back through the page you used to register for this site.

And with that, once again, thank you to Jeff Hojlo and Heatherly Bucher for a great talk. Our thanks to our sponsor, Arena. And, as always, special thanks to our audience for joining us today. We hope you found today’s event valuable and that you’ll return for future IEEE Spectrum webcasts. Have a great day.