Is the evolution of business software development principles parallel to that of manufacturing, and can manufacturing's history predict software's future? Interestingly, comparing the history of manufacturing since the Industrial Revolution with the past 20 years of software development reveals striking similarities.
The advances in manufacturing principles have been driven both by improvements in production techniques and by the evolution of customer needs. The first and second industrial revolutions were all about mass-producing manufactured goods. At the beginning of the 20th century, Fordism brought a radical change: thanks to the invention of the assembly line, the principle of standardization made it possible to produce thousands of identical units of the Ford Model T. The cost savings this new technique brought democratized access to these goods, but with a serious limitation: only one single model was available, as Henry Ford's famous remark underlined: “Any customer can have a car in any color that he wants so long as it is black”.
In software development, the same pattern played out: the first approach adopted was completely monolithic. As Gunnar Menzel defines it in his article Microservices in cloud-based infrastructure, this one-size-fits-all model consists of a “software design pattern that includes all functional and non-functional features into one box”. What happened? As with the cars, customers started to use the particular word we hear every day in software development: BUT. “I like it, but…”, “It does almost what I want, but…”. The “buts” of the early 1920s led to the next step forward in car manufacturing: General Motors started to offer a variety of colors and models.
After the war, a new manufacturing game-changer arrived with Toyota’s disruptive approach. Toyotism and the Lean manufacturing model put in place a series of principles, chief among them the idea of continuous improvement. Some of the concepts of this manufacturing turnaround were adopted much later in software development and described by Eric Ries in his book The Lean Startup. Toyota’s inspirational manufacturing model, years later, led software companies to adopt agile, iterative and highly responsive methodologies to better answer their customers’ requirements and all their “buts”.
The monolithic model was no longer acceptable, as solution architectures needed a finer grain. Service-Oriented Architecture (SOA) started to replace the previous approach by “exposing discrete components of an application as web services” (G. Menzel). The multi-layered approach of SOA “slices up” the solution into multiple layers of services and therefore provides more flexibility.
We are now entering a new manufacturing era: the Smart Manufacturing, or Industry 4.0, age. Customer demands have changed, and manufacturing companies now face a new challenge: how to provide mass-customized products to their clients? The same question can be asked of software development: how does one provide highly customized pieces of software that can also be supported, upgraded and evolved throughout their whole lifecycle? The answer resides in increasing the granularity of the provided solutions. The microservices approach is the best answer to this question yet. As Menzel defines them, microservices are “independent application services delivering one single business capability in an independent, loosely connected and self-contained fashion”. In other words, every piece of the puzzle does one thing, does it well and is highly independent of the rest.
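To make the principle concrete, here is a minimal, purely illustrative Python sketch (all names are invented): two “services”, each owning its own data and exposing one business capability, interacting only through plain messages. In a real microservices deployment these would be separate processes talking over HTTP or a message bus, not in-process calls.

```python
class InventoryService:
    """Single capability: track and reserve stock. Owns its own data."""
    def __init__(self):
        self._stock = {"model-t": 3}   # hypothetical sample data

    def handle(self, request):
        # The only way in: a message with a well-defined shape.
        if request["action"] == "reserve":
            sku = request["sku"]
            if self._stock.get(sku, 0) > 0:
                self._stock[sku] -= 1
                return {"status": "reserved"}
            return {"status": "out-of-stock"}
        return {"status": "unknown-action"}


class OrderService:
    """Single capability: place orders. Knows Inventory only via messages."""
    def __init__(self, send_to_inventory):
        self._send = send_to_inventory   # would be a network call in practice
        self.orders = []

    def place_order(self, sku):
        reply = self._send({"action": "reserve", "sku": sku})
        if reply["status"] == "reserved":
            self.orders.append(sku)
            return True
        return False


inventory = InventoryService()
orders = OrderService(inventory.handle)
print(orders.place_order("model-t"))   # True: stock reserved, order recorded
```

Because each service hides its internals behind a message contract, either one can be rewritten, redeployed or scaled without touching the other; that loose coupling is what the definition above is pointing at.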
Getting back to the initial question: is the manufacturing evolution a good indicator to predict the future of software development? The principles of both sectors advance in parallel because customer demand evolves in the same direction, from mass standardization to mass customization. Mass customization of manufactured goods is only possible in a Smart Factory. Mass customization of software development is only possible thanks to one special, highly performant platform: a Smart DevOps software factory. We will follow up on this concept in a future article.
PLM (Product Lifecycle Management) is a methodology used in manufacturing whose central concept is the lifecycle. Life is a cycle that starts with birth and continues with growth and, finally, death. For human beings, pregnancy lasts 9 months and birth a few hours (hopefully…). Life expectancy at birth, on the other hand, is about 100 times longer: the latest estimates in developed countries are around 80 years! Carrying and delivering a child is a beautiful and painful moment of life, but it is only a moment: it is just the beginning of the story, just as the first software delivery is just the start.
Our customers wish to have beautiful and healthy pieces of software that adjust exactly to their Form, Fit and Function but, more importantly, they need them to grow and mature with them. Our customers live and breathe PLM every day as a central concept of their manufacturing business. We have learned from this concept and applied it to software development by creating a unique DevOps platform that allows us to be not only the prolific parents of hundreds of apps but also the caring parents who will accompany them all along their life journey.
What is the benefit?
Our customers make tractors, jewelry, spaceships, furniture… that is their job, not producing and maintaining software. The Wincom DevOps platform and the apps methodology keep their Total Cost of Ownership (TCO) significantly lower, predictable and managed.
An important secondary effect is interestingly pointed out by John Papageorge in his post “What’s the real total Cost of Ownership of on-premise PLM software?”: the economic concept of “opportunity cost”. If you invest your money in in-house customized software solutions, you will not only get a poor ROI but also tie up your resources on a task that is not your core business. The money you invested in support software, had it been invested in innovation in your core business, might have made your company even more competitive in its industry and therefore increased your revenue: yet, somehow, you missed the opportunity.
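The opportunity-cost argument can be illustrated with back-of-the-envelope arithmetic. All figures below are invented purely for illustration; they are not taken from Papageorge's post or from any real project.

```python
# Hypothetical figures: a budget spent on in-house support software
# versus the same budget invested in core-business innovation.
budget = 500_000            # assumed spend on in-house customized software
inhouse_return = 0.02       # assumed (low) return on a non-core software effort
core_return = 0.15          # assumed return on core-business innovation

inhouse_gain = budget * inhouse_return          # 10,000
core_gain = budget * core_return                # 75,000
opportunity_cost = core_gain - inhouse_gain

print(f"Opportunity cost of the in-house route: {opportunity_cost:,.0f}")  # 65,000
```

The point is not the exact numbers but the structure of the reasoning: the real cost of the in-house route is not just its poor ROI, but the foregone return of the better alternative.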
A highly valuable lesson we learned from our manufacturing customers is that birth is the first step of a long life(cycle) journey: this has allowed us to be the parents of almost 300 software babies and, more importantly, to support them as they grow up.
This post is the first of a series on how to apply manufacturing concepts to software developments.
I don’t know if you are much into fashion, but last year women’s jumpsuits were quite trendy and, like pretty much every girl I know, I got one. Last week I wanted to wear it to a party, but due to too much good food over the holidays and a January lack of motivation to go to the gym, it was a no-go. I bitterly thought: I should have bought a pair of trousers, a blouse and a jacket, as some of it would probably still fit!
As you are not reading Vogue but a blog about PLM, you are now wondering: what on earth is the connection with software? As a software company we are often given a list of features that a customer needs, and they expect us to build a single solution to satisfy these requirements. Normally, a project is started and, after several months of hard work, brainstorming, meetings, development, more meetings, testing, feedback and yet more meetings, you finally get a piece of software that more or less fits the original need.
However, the business, the requirements, the teams and the related software systems never stop evolving, and almost from day one the delivered solution needs to evolve, be enhanced and extended, and some features even need to be dropped just to keep up; not to mention the inevitable upgrade of the PLM. That usually means more brainstorming, more meetings, more testing, more feedback. Very soon the total cost of ownership (TCO) starts to spiral out of control.
So… what if, instead of purchasing this jumpsuit piece of software, your requirement is divided into a series of apps that complement one another and, all together, provide you the perfect outfit? This solution has a lot of benefits:
– Apps are much easier to implement and deploy, and therefore less costly.
– When you need to extend or enhance one aspect of your initial requirement, this can be done in just the one app that carries that specific feature. This “surgical” approach also means significant time and cost savings.
– If you no longer need some of the features you planned in the first place, the corresponding app can simply be uninstalled without touching any of the others.
– The result is a much better managed TCO and a big decrease in the associated ownership headaches.
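The benefits above can be sketched in code. The following is a purely illustrative Python sketch (the platform class and app names are made up, not Wincom's actual API): a solution assembled from apps that can be installed, replaced and uninstalled independently, unlike a single monolithic delivery.

```python
class AppPlatform:
    """Toy model of a platform hosting independent apps (illustrative only)."""
    def __init__(self):
        self._apps = {}

    def install(self, name, run):
        # Each app is deployed on its own; installing one touches nothing else.
        self._apps[name] = run

    def uninstall(self, name):
        # Dropping a no-longer-needed feature removes just its app.
        self._apps.pop(name, None)

    def run(self, name, *args):
        return self._apps[name](*args)


platform = AppPlatform()
platform.install("change-report", lambda: "weekly change report")
platform.install("bom-export", lambda: "BoM exported")

# A feature is no longer needed: uninstall just that app.
platform.uninstall("bom-export")
print(platform.run("change-report"))   # the remaining app still works
```

The contrast with the “jumpsuit” is the unit of change: each requirement lives in its own small deliverable, so extending, fixing or retiring one feature never forces a rebuild of the whole outfit.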
Apps are the trend in software, but successfully delivering an app-based solution means using a DevOps platform built on PLM ideas to make this concept a reality and keep us away from the monolithic spaghetti that most systems turn into.
A fashion tip: Avoid jumpsuits and invest in more flexible solutions that can evolve with your shape and form… or your clothes will stay in the closet.
In your company you probably hear questions like: “Where are my changes?”, “What is taking so long?” or “What is holding up the process?”… and many people saying, in essence, “I don’t want to understand Windchill, but I need the information it holds”. That is why we created the External App Platform, where your management, your services, your external partners or whoever you decide can actually see all this information without having to dig into Windchill’s complexity to find it. As every customer has unique needs and requirements, we brand and configure the platform exactly to your needs. We can configure as many platforms as you require: your executive management will not need access to the same information as your external partners!
This is why the External App Platform (EAP) is a Service-as-a-Product: You get exactly what you need, but it is supported and maintained as a product would be.
The EAP has different modules that you can plug in. The first one is Data View: it allows you to visualize the information you need about changes, states, etc. in graphs and tabs through a very user-friendly interface. Every single column, row and type of graph is entirely configurable, and the best part is that you can export all this information in different formats (.csv, .xls, .pdf) and, with our report engine, even produce a hard-copy report.
Through our second module, Search, you can navigate easily through your structures.
Portal (our third module) allows you instant access to CAD and part data, and to 3D visualization. More modules are in progress, such as a new Workflow Visualization module.
Finally, we can build modules to order, to match exactly what you might need.
Our EAP is currently in production at many large, international manufacturing companies in fields such as agroindustry, defense and furnishing.
Interested? Request an online demo or an evaluation copy here! And please take a look at our App Center.
After 15 years working in PLM, we have seen many PLM implementations delivered late, and only after much pain for everyone involved; some projects fail to deliver anything at all. So why is PLM so hard to implement? Why are the projects almost always late? And how can we deliver a successful PLM on time?
Why projects are late and/or fail
Here are some common reasons why a PLM project is late or fails:
- Resources: getting the right ones with the right knowledge
- Unrealistic expectations of management and users
- Under-estimating the complexity of PLM
- It’s the fault of the software: we should have bought PTC, Dassault, Siemens etc.
- Delays of vendors to fix “critical” issues
- Technical problems that delay migration of data
There are many more, but in essence it comes down to project management and resources, and in PLM the way we manage these things doesn’t seem to have changed in decades.
Let’s start at the beginning, scoping the project: how long will it take and how much will it cost?
Often, implementers are obsessed with counting man-days. Most PLM projects are scoped the same way: the implementer spends days or weeks producing a huge Excel spreadsheet with a large number of line items specifying each activity and the number of days it will take. They add them up and triumphantly present a total of XXX days. The customer is shocked, and the next few weeks are spent trying to find “savings”. It is an archaic way of defining a project that assumes we know everything from the beginning. But how can this be right? How can we know what we don’t know before we even begin?
Over time, lots of project management ideas have come and gone to address this failure to deliver PLM, such as platforming, critical chain and others, which certainly improve things; but I propose we run our PLM projects as if the PLM team were a “startup”.
“The Lean Startup” is one of a series of recent books that try to help entrepreneurs create and manage successful tech startups. The book has many ideas, but a key one is to get something to market as soon as possible so you can learn what you don’t know. The idea is not new (the Agile software development methodology, for example, also promotes it), and ironically these methods largely reuse ideas from Lean Manufacturing that the manufacturing industry itself invented. In “The Lean Startup”, this idea takes the form of a concept called the MVP, the Minimum Viable Product.
I propose we need to think about our PLM projects as the Minimum Viable PLM to make our projects a success.
What is a Minimum Viable PLM?
A Minimum Viable PLM is not a prototype and not a “bad” implementation; it is a production-worthy system whose requirements have been cut down to the core. Once it is in production, we rapidly move to improve the solution, fix issues and add the features that make our PLM the fully featured system our users require.
So we need to learn from our users and fix our mistakes, but we also need to be brave and make decisions (even wrong ones) to avoid analysis paralysis. It does require a responsive, dynamic team to make it work.
So the question is: could it work? Our experience tells us it does, as the following case study illustrates.
MVP Case study
A medium-sized French company that makes postal machines implemented a successful PLM in one year. How?
- A project manager who ruthlessly defined his MVP
- An infrastructure that meant they could change their decisions quickly and learn from mistakes
- A small, highly focused team
- A mechanism to release new software updates almost daily
- Really good software people, a skilled implementation team (Accenture) and a fully engaged customer
At go live, our Minimum Viable PLM looked like the image below and was successfully used in production.
But that was not the end; two years after go live, it now looks like this:
Over time there have been issues, but these have been rapidly resolved. More features are constantly being added, and some will take more time, such as a key goal to radically change the way products are designed using ETO (Engineer-to-Order); some things in PLM are just plain difficult.
MVP really works: we have seen it work, and we know that even in projects that are traditionally scoped it rapidly takes hold, simply by providing the tools to iterate quickly. However, we must have the project management and technical expertise to support this concept, and the bravery to make decisions that may turn out to be wrong.
Through our work extending and customizing PLM, we are constantly in conversation with customers and Value Added Resellers across the world about what companies are doing today, what they are planning for tomorrow, and the direction in which they are trying to take their PLM systems. In this article I will summarize the key day-to-day problems that concern our customers, namely efficiency, and the longer-term drivers that are pushing them, namely innovation.
One key driver in this area is user interface efficiency; with the advent of sophisticated mass-market websites like Facebook, LinkedIn and Salesforce, users expect and demand software that is easy to use. Many requests we get are simply to reduce the number of clicks required to perform an action. PLM vendors have responded, and core product interfaces are improving, but some tasks remain mysteriously complicated. Customers all do things slightly differently, so specific user interfaces are in demand, often to improve the experience of users with frequent, repeated tasks, e.g. creating data, mass actions such as printing, or applying business validation rules in the user interface to avoid complex and time-consuming workflow processes.
Efficiency is also required for inter-system communication: almost all companies have the classic PLM/ERP interface requirement, but many wish to connect other systems as well. The Windchill Problem Report object can be used to start the change process, but users will no longer accept having to switch systems and double-enter data. They want their issue-tracking system to be integrated and to “send” the request to the PLM. There is a frequent requirement to integrate software within the product development process, and we have implemented a number of targeted solutions to address this. This integration is also a key selling point for vendors, as shown by PTC’s acquisitions of ThingWorx and Integrity.
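The core of such an integration is usually a field mapping between the two systems. Here is a hedged Python sketch: the field names on both sides are invented for illustration (they are not the actual Windchill or issue-tracker schemas), and in a real integration the resulting payload would be sent to the PLM's API rather than printed.

```python
def ticket_to_problem_report(ticket):
    """Map a hypothetical issue-tracker ticket to a hypothetical
    PLM Problem Report payload (field names are illustrative)."""
    severity_to_priority = {"high": "1", "medium": "2", "low": "3"}
    return {
        "name": ticket["title"],
        "description": ticket["body"],
        "priority": severity_to_priority.get(ticket["severity"], "3"),
        # Keep a back-reference so the change process stays traceable
        # to the original ticket.
        "source": f"tracker:{ticket['id']}",
    }


ticket = {
    "id": "ENG-101",
    "title": "Bracket interferes with housing",
    "body": "Clash detected in assembly X.",
    "severity": "high",
}
print(ticket_to_problem_report(ticket))
```

The user never leaves the tool they already know; the integration carries the request into the PLM, which is exactly the "no double entry" expectation described above.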
Technical publishing and distribution of CAD data is another active area where efficiency is key. For products to ship quickly, manuals need to be updated, translated and delivered with accurate information. The information needed lives in the PLM, so closer integration and improved connectivity with the document-authoring software is important. For example, we recently streamlined a company’s manual translation process, as this was seen as a key cost and delay in the product release process.
Another frequent requirement is easy access to product data without entering the PLM, such as through custom portals. These are used by ERP or service users to access drawings and 3D models, and often need to be mobile-compatible. As data moves outside the PLM realm, protecting IP has become a greater focus, with attention being paid to solutions such as watermarking.
So if we look at the underlying driver of all these requests, it is that organizations are under pressure to collaborate better and to access and use product data efficiently. PLM is becoming a truly enterprise tool and less of a CAD-oriented workgroup manager. The PLM managers we speak with are swamped with new requirements, increasingly coming from outside the engineering department.
Being first to get a product to market is clearly vital, but just as important is creating the “right” product. Whilst efficiency can be seen as a tactical improvement within the organization, to create products faster and with improved quality, innovation is a strategic goal: to make better products. This is the driving force behind another set of requirements that we receive.
The companies we work with often see their competitive advantage coming from being more reactive and nimble than their competitors. Their ultimate goal is to respond to their clients’ needs and provide them the perfect product as quickly as possible. This is the holy grail of PLM and is known as mass customization; related terms include something-to-order (design, engineer or assemble), options and variants, and product configurators. Whatever the terminology used, the objective is the same: to create products that are tailored to the customer.
The problem is that mass customization is not easy: for many years PLM vendors have tried to put solutions into the marketplace and generally failed. The reason is that this problem touches every part of the product design process, from the way the CAD engineer designs to the way the product is serviced. Most organizations know this is a puzzle that must be tackled piece by piece. The first piece attracting a lot of attention at present is the Bill of Materials transformation process, for example eBoM to mBoM. For a surprisingly large number of clients this is a painstakingly manual process, so there is a lot of interest in products such as PTC’s MPMLink. We have a number of clients that have implemented it or are in the process of doing so; the current version is somewhat “quirky”, although the underlying principles of the product are considered impressive. A new version is due soon and is eagerly awaited. Other customers, using Enovia, are talking about similar BoM transformation issues.
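To show what a BoM transformation involves at its simplest, here is an illustrative Python sketch. The data structures and the regrouping rule are invented and far simpler than what tools like MPMLink do: the engineering BoM lists parts by design function, while the manufacturing BoM regroups them by assembly step, e.g. pulling all fasteners into a dedicated kitting station.

```python
def ebom_to_mbom(ebom):
    """Regroup engineering BoM lines into hypothetical manufacturing
    stations (a toy stand-in for a real eBoM-to-mBoM transformation)."""
    mbom = {"station-assembly": [], "station-fastener-kitting": []}
    for line in ebom:
        # Illustrative rule: fasteners go to a kitting station,
        # everything else to the main assembly station.
        if line["category"] == "fastener":
            mbom["station-fastener-kitting"].append(line["part"])
        else:
            mbom["station-assembly"].append(line["part"])
    return mbom


ebom = [
    {"part": "frame-001", "category": "structure"},
    {"part": "bolt-M8", "category": "fastener"},
    {"part": "panel-002", "category": "structure"},
]
print(ebom_to_mbom(ebom))
```

In real products the rules depend on plant layout, process plans and effectivity, which is why the manual version of this work is so painstaking and why dedicated transformation tools attract so much interest.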
So, in essence, the long-term goal is mass customization, and to achieve it companies are focusing on better product definition and data transformation tools.
Product data is seen as more and more valuable, and therefore the role of PLM in the organization is on the rise. PLM is under pressure to deliver a greater diversity of products, faster and with better quality. This requires much better automation of the product development process, from cradle to grave. We still have a lot of work to do!