Author: Frederic Zeller

PLM Architecture: A Tentative Overview

1. Introduction

Product Lifecycle Management (PLM) is a complex and multifaceted approach to managing the entire lifecycle of a product from inception, through engineering design and manufacture, to service and disposal. This document provides a detailed overview of PLM architecture, taking into account its business context, key concepts, and the inherent complexity of breakdown structures within PLM systems.

The goal of this overview is to offer a comprehensive understanding of PLM systems and their implementation in modern enterprise environments. By exploring the intricacies of PLM architecture, we aim to equip stakeholders with the knowledge necessary to effectively design, implement, and manage PLM systems that can handle the complexities of modern product development and lifecycle management.

2. Complexity of Breakdown Structures in PLM

Breakdown structures are the cornerstone of PLM systems and the primary source of their complexity. These hierarchical representations of product information serve as the backbone for organizing and managing data throughout a product’s lifecycle.

2.1 Product Breakdown Structure (PBS)

The Product Breakdown Structure (PBS), generally called the BOM (Bill of Materials), is a hierarchical decomposition of a product into its constituent parts, assemblies, and sub-assemblies. It serves as a blueprint for understanding the product’s composition and the relationships between its components. The PBS:

  • Provides a clear view of product structure and composition
  • Facilitates component management and traceability
  • Supports several possible breakdown structures, depending on the product lifecycle processes, including design, manufacturing, and maintenance
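A PBS/BOM is naturally modeled as a tree. Here is a minimal sketch in Python (all class names and part numbers are invented for illustration) showing how such a hierarchy can be represented and walked to produce an indented BOM report:

```python
from dataclasses import dataclass, field

@dataclass
class BomNode:
    """One node of a hypothetical Product Breakdown Structure."""
    part_number: str
    description: str = ""
    quantity: int = 1                      # quantity in the parent assembly
    children: list["BomNode"] = field(default_factory=list)

    def flatten(self, level: int = 0):
        """Yield (level, part_number, quantity) for an indented BOM report."""
        yield (level, self.part_number, self.quantity)
        for child in self.children:
            yield from child.flatten(level + 1)

# A tiny two-level structure: a bike with a frame and two wheels.
bike = BomNode("BIKE-001", "Bicycle", children=[
    BomNode("FRM-010", "Frame"),
    BomNode("WHL-020", "Wheel", quantity=2),
])

for level, pn, qty in bike.flatten():
    print("  " * level + f"{pn} x{qty}")
```

Real PLM systems add revisions, effectivities, and occurrence data to each node; this sketch shows only the hierarchical skeleton.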

2.2 Related Breakdown Structures

While the PBS is central to PLM, several other breakdown structures contribute to the overall complexity:

  • Requirement Breakdown Structure: Organizes and manages product requirements
  • Functional Breakdown Structure: Represents the functional architecture of the product
  • Plant Breakdown Structure: Describes the layout and organization of manufacturing facilities
  • Manufacturing Process Breakdown Structure: Outlines the steps and resources needed for product manufacturing
  • Maintenance Breakdown Structure: Defines the maintenance activities and schedules for the product

2.3 Revision Management

Revision management tracks changes to components, assemblies, or entire products over time. It involves:

  • Maintaining relationships between revised components
  • Ensuring compatibility between different revision levels (notion of Form, Fit, Function or FFF)
  • Tracking the history of changes
  • Supporting impact analysis of proposed changes
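To make the FFF notion concrete, here is a small illustrative sketch (not any vendor's actual rule engine; names and fields are invented): two revisions of a part are interchangeable only if every intermediate revision preserved Form, Fit, and Function:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Revision:
    """One revision of a part (illustrative model, not a specific PLM API)."""
    part_number: str
    rev: str
    fff_compatible: bool  # True if Form/Fit/Function preserved vs. previous rev

def interchangeable(history: list[Revision], rev_a: str, rev_b: str) -> bool:
    """Two revisions are interchangeable if every revision between them
    (exclusive of the earlier one) preserved Form, Fit, and Function."""
    revs = [r.rev for r in history]
    i, j = sorted((revs.index(rev_a), revs.index(rev_b)))
    return all(r.fff_compatible for r in history[i + 1 : j + 1])

history = [
    Revision("PN-100", "A", True),
    Revision("PN-100", "B", True),    # minor change, FFF preserved
    Revision("PN-100", "C", False),   # breaking change: in practice, a new part number
]
print(interchangeable(history, "A", "B"))  # True
print(interchangeable(history, "A", "C"))  # False
```

The last comment reflects a common configuration management rule: a change that breaks FFF normally triggers a new part number rather than a new revision.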

2.4 Effectivities

Effectivity defines when a specific configuration or component is valid within a product structure. Types include:

  • Date effectivity: Based on calendar dates
  • Serial number effectivity: Based on product serial numbers
  • Lot effectivity: Based on manufacturing lots or batches
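All three effectivity types can be seen as validity ranges over a value (a calendar date, a serial number, or a lot identifier). A minimal sketch, with invented field names:

```python
import datetime as dt
from dataclasses import dataclass

@dataclass
class Effectivity:
    """Illustrative effectivity record; field names are assumptions."""
    kind: str         # "date", "serial" or "lot"
    start: object     # first valid date / serial / lot
    end: object = None  # last valid value (None = open-ended)

    def is_valid(self, value) -> bool:
        # Valid if the value falls inside the [start, end] range.
        return self.start <= value and (self.end is None or value <= self.end)

# Date effectivity: valid from 2024-01-01, open-ended.
date_eff = Effectivity("date", dt.date(2024, 1, 1))
# Serial-number effectivity: valid for end-product serials 100..199.
serial_eff = Effectivity("serial", 100, 199)

print(date_eff.is_valid(dt.date(2024, 6, 1)))  # True
print(serial_eff.is_valid(250))                # False
```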

2.5 Options and Variants

  • Options: Selectable features or components that can be added to a base product
  • Variants: Distinct versions of a product with pre-defined sets of options
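The distinction can be sketched as follows: a variant is a named, pre-defined option set on a base product, while options remain individually selectable (product and option names are invented):

```python
# Selectable options on a hypothetical base product.
OPTIONS = {"sunroof", "towbar", "sport_seats"}
# Variants: pre-defined, named selections of those options.
VARIANTS = {
    "CITY":  set(),                        # base configuration
    "SPORT": {"sunroof", "sport_seats"},
}

def resolve(variant: str, extra_options: set[str] = frozenset()) -> set[str]:
    """Return the option set of a variant, optionally adding valid options."""
    unknown = extra_options - OPTIONS
    if unknown:
        raise ValueError(f"unknown options: {unknown}")
    return VARIANTS[variant] | set(extra_options)

print(sorted(resolve("SPORT", {"towbar"})))  # ['sport_seats', 'sunroof', 'towbar']
```

Real configurators also enforce compatibility rules between options (e.g. mutual exclusions), which this sketch omits.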

2.6 Interrelationships and Challenges

The complexity of PLM systems stems from the intricate relationships between these structures:

  • Cross-structure dependencies: Managing relationships between different breakdown structures
  • Configuration management challenges: Ensuring consistency across complex product structures
  • Data management challenges: Handling large volumes of product data across multiple domains

Breakdown structures are the key concept of PLM and the main source of complexity in these systems. They provide a structured way to represent and manage the various aspects of a product throughout its lifecycle. However, the multitude of interconnected breakdown structures, their hierarchical nature, and the need to manage changes and variations across these structures create significant complexity.

This complexity is further amplified by the need to maintain consistency across different domains, manage revisions and effectivities, and handle large volumes of data. Understanding and effectively managing these breakdown structures is crucial for successful PLM implementation and operation.

3. Key Concepts in PLM Architecture

To manage the complexity inherent in breakdown structures, PLM architecture relies on several key concepts:

3.1 Occurrence Effectivity

Occurrence Effectivity defines when and where a specific configuration or part is valid within a product structure. It enables management of product changes over time, allowing for:

  • Tracking of component usage across different product versions
  • Management of product variants and configurations
  • Efficient handling of product evolution and updates

3.2 Baseline

A Baseline is a snapshot of a product’s configuration at a specific point in time (or end product serial number). It serves as a stable reference point for development and analysis, providing:

  • A consistent view of the product configuration for all stakeholders
  • A foundation for change management and version control
  • A mechanism for tracking product evolution over time
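The essential property of a baseline is immutability: once taken, it no longer follows the evolution of the live structure. A minimal sketch of that behavior (the data model is invented):

```python
import copy
import datetime as dt

# Live, evolving product structure: part number -> revision currently in work.
product_structure = {"PN-100": "B", "PN-200": "A"}

def create_baseline(structure: dict, label: str) -> dict:
    """Freeze a snapshot of the configuration at a point in time."""
    return {
        "label": label,
        "date": dt.date.today().isoformat(),
        "items": copy.deepcopy(structure),  # frozen copy, immune to later edits
    }

baseline = create_baseline(product_structure, "BL-2024-01")
product_structure["PN-100"] = "C"     # design keeps evolving...
print(baseline["items"]["PN-100"])    # ...but the baseline still says 'B'
```

The deep copy is the point of the sketch: a baseline that merely referenced the live structure would silently drift as the design evolves.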

3.3 Configuration Item (CI)

A Configuration Item is a fundamental unit of a product that is managed separately in the configuration management process. CIs allow for modular management of complex products by:

  • Enabling independent management of product components
  • Facilitating change control and version management at a granular level
  • Supporting traceability and impact analysis of changes

4. PLM Architecture Overview

To address the complexity of breakdown structures and implement the key concepts effectively, PLM architecture typically consists of the following core components:

4.1 Enterprise Data Manager (EntDM)

The EntDM serves as the central hub of the PLM architecture. Its responsibilities include:

  • Managing various Breakdown Structures (BS) across the product lifecycle
  • Handling transverse change and project management
  • Managing Configuration Item (CI) applicability (applicability covers effectivities and option/variant applicability rules)
  • Creating and distributing Baselines
  • Integrating Configuration Items from various domains

4.2 Team Data Managers (TDMs)

TDMs are domain-specific components responsible for:

  • Managing data and processes within specific business domains
  • Receiving (often partial) Baselines from the EntDM
  • Providing Configuration Items back to the EntDM
  • Managing design iterations within their domain
  • Embedding and managing domain-specific authoring tools

4.3 Authoring Tools

These are specialized software tools embedded within each TDM, including:

  • MCAD (Mechanical Computer-Aided Design)
  • ECAD (Electronic Computer-Aided Design)
  • FEM (Finite Element Method) tools
  • Plant Design software
  • Simulation tools

4.4 Workflow and Data Flow

  1. The EntDM creates a Baseline and distributes it to relevant TDMs.
  2. TDMs work on their domain-specific aspects using the provided Baseline.
  3. TDMs submit their Configuration Items back to the EntDM.
  4. The EntDM integrates these CIs into the overall product structure, defines their effectivities, and creates new Baselines.
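The four-step loop above can be sketched as follows (class, method, and field names are invented; real EntDM/TDM interfaces are far richer):

```python
class EntDM:
    """Hypothetical central hub: baselines out, Configuration Items in."""
    def __init__(self):
        self.structure = {}   # integrated product structure: CI id -> data
        self.baselines = []

    def create_baseline(self, label: str) -> dict:
        # Step 1: snapshot the current structure and distribute it to TDMs.
        baseline = {"label": label, "items": dict(self.structure)}
        self.baselines.append(baseline)
        return baseline

    def integrate(self, ci_id: str, ci_data: dict, effectivity: str):
        # Step 4: integrate the CI and define its effectivity.
        self.structure[ci_id] = {"data": ci_data, "effectivity": effectivity}

class MechanicalTDM:
    """Hypothetical domain-specific data manager."""
    def work(self, baseline: dict):
        # Steps 2-3: iterate in the domain, then return a Configuration Item.
        return ("CI-FRAME", {"cad_model": "frame_v3.step"})

entdm, tdm = EntDM(), MechanicalTDM()
bl = entdm.create_baseline("BL-001")
ci_id, ci_data = tdm.work(bl)
entdm.integrate(ci_id, ci_data, effectivity="serial >= 100")
print(entdm.structure["CI-FRAME"]["effectivity"])  # serial >= 100
```

Note that the TDM only ever sees a resolved snapshot, which is precisely the "exchange only static structures" principle of section 5.2.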

5. Rationale for PLM Architecture

The architecture described above is designed to address the complexity of PLM systems while achieving several key objectives:

5.1 Key Objectives

  1. Simplify data exchange
  2. Reduce Configuration Management complexity
  3. Improve efficiency and reduce errors

5.2 Approach

  • Exchange only resolved (static) data structures between EntDM and TDMs
  • Centralize Configuration Management in the EntDM

5.3 Benefits

  1. Simplified Data Exchange
  2. Centralized Configuration Management
  3. Clear Separation of Concerns
  4. Improved Efficiency
  5. Enhanced Data Integrity
  6. Scalability

6. Strategies for Managing Complexity

Given the inherent complexity of PLM systems, several strategies can be employed to manage and mitigate this complexity:

6.1 Modular Design Approaches

  • Implement component-based architecture
  • Develop standardized data models and interfaces
  • Create abstraction layers to hide complexity

6.2 Advanced PLM Systems

  • Leverage intelligent data management techniques
  • Utilize cloud-based PLM solutions for scalability and accessibility
  • Implement robust integration platforms

6.3 Clear Governance Policies

  • Establish data governance frameworks
  • Define clear process governance
  • Ensure compliance management

6.4 Visualization Tools

  • Implement 3D visualization for product data
  • Utilize data visualization techniques for complex relationships
  • Employ process visualization tools for workflow management

6.5 Continuous Improvement and Adaptation

  • Conduct regular system audits
  • Adopt agile implementation methodologies
  • Implement knowledge management systems

7. Impact on Service-Oriented Architecture

The PLM architecture we’ve discussed has significant implications when implemented in a service-oriented architecture (SOA). This section explores these impacts, particularly focusing on the Enterprise Data Manager (EntDM) and Team Data Managers (TDMs).

7.1 Enterprise Data Manager (EntDM) in SOA

While the EntDM can be implemented as one or several applications delivering services, there are important considerations:

  • Shared Business Logic: Even when distributed across multiple applications, these services share the same business logic, typically implemented as a shared library.
  • Coherent Service Set: Despite potentially not being monolithic, the EntDM represents a coherent set of services. This coherence is crucial for maintaining consistency in data management and configuration control across the product lifecycle.
  • Single Editor Source: Typically, these services originate from a single software editor or vendor. This ensures consistency in implementation and updates.
  • Potential for Collaboration: The editor may choose to share the underlying shared library with partners or even competitors. This could foster ecosystem development but requires careful management of intellectual property and standardization.

7.2 Team Data Managers (TDMs) in SOA

Similar principles apply to the implementation of TDMs in a service-oriented architecture:

  • Domain-Specific Services: Each TDM can be composed of multiple services, but these are typically specific to a particular domain (e.g., Mechanical Design, Electrical Design).
  • Shared Domain Logic: Within each TDM, services share domain-specific business logic, often implemented as a shared library for that particular domain.
  • Vendor Specialization: Different TDMs might come from different vendors specializing in specific domains, but each TDM set is typically from a single vendor to ensure coherence within the domain.

7.3 Implications for PLM Architecture

This approach to implementing EntDM and TDMs in a service-oriented architecture has several implications:

  1. Scalability: Services can be scaled independently based on demand, potentially improving system performance and resource utilization.
  2. Modularity: The use of shared libraries within EntDM and TDMs promotes modularity, potentially making it easier to update or extend functionality.
  3. Consistency: Shared business logic ensures consistency in data handling and business rules application across services.
  4. Vendor Lock-in: The reliance on vendor-specific shared libraries can lead to a degree of vendor lock-in, particularly for the EntDM.
  5. Integration Challenges: While each set of services (EntDM or specific TDMs) is internally coherent, integration between different TDMs or between TDMs and the EntDM may present challenges, especially if they come from different vendors.
  6. Standardization Opportunities: If vendors are willing to share or standardize their shared libraries, it could lead to greater interoperability in the PLM ecosystem.
  7. Performance Considerations: While SOA offers benefits in terms of scalability and modularity, it may introduce performance overhead due to service communication. Careful design is needed to balance distribution and performance.

This service-oriented approach to PLM architecture allows for a flexible and scalable system while maintaining the coherence necessary for effective product lifecycle management. However, it also underscores the importance of careful vendor selection, system integration, and potential standardization efforts in the PLM industry.

8. Conclusion

PLM architecture is a complex domain that requires careful consideration of various factors including data management, process integration, and technological implementation. The breakdown structures that form the backbone of PLM systems are both the key to effectively managing product lifecycle and the primary source of complexity.

By understanding the intricacies of breakdown structures, implementing key PLM concepts, and adopting a well-designed architecture, organizations can create PLM systems that effectively manage this complexity. The move towards service-oriented architectures presents both opportunities and challenges, emphasizing the need for thoughtful design and implementation approaches.

Effective PLM implementation requires a balance between centralized control and domain-specific flexibility, as well as a commitment to continuous improvement and adaptation. By leveraging the strategies outlined for managing complexity, organizations can implement robust PLM systems that support their product lifecycle management needs, driving innovation, efficiency, and competitiveness in today’s rapidly evolving product development landscape.

PLM and MBOM – a first look

First, some terminology:

(Always useful, even if some terms are not used in the post)

Part: any component, sub-assembly, assembly, or product that is referenced during product or manufacturing design.

Manufactured Part: This is the actual physical piece that has been fabricated according to the specifications provided in the design, in a Manufacturing Process. It involves the utilization of materials, machinery, and labor to transform a design into a tangible item that can be assembled or integrated into a larger system or product. Once produced, the manufactured part is stored in a stock and extracted on demand to be consumed in a new Manufacturing Process. These parts are usually designed to meet specific requirements, dimensions, materials, or tolerances that are unique to the company’s products.

Design Part: In contrast, a design part is a conceptual or virtual representation of a component. It exists in the form of drawings, CAD models, or other design documents, containing all the specifications, dimensions, materials, and other information required to manufacture the part. The design part serves as the blueprint for creating the manufactured part.

Configured Part: a fully defined part, with no ambiguity in terms of definition, unlike generic parts, which can for example carry options/variants.

Material (or Raw Material): Raw materials are the unprocessed natural substances or basic elements used as the starting point in manufacturing and production.

Work-In-Progress (WIP): refers to the goods that are in various stages of the production process but have not yet been completed. These items have already incurred some labor, material, and overhead costs but are not yet finished products.
WIP is not managed in stock; thus, in most cases, it does not have a part number. It is referenced as the result of a manufacturing operation.

Semi-Finished Part: A semi-finished part is a component that has undergone some, but not all, of its manufacturing processes and is intermediate between raw materials and the final product. It may require further machining, shaping, or assembly to become a finished part suitable for use in a final product. A semi-finished part passes through a stock, so it must have a part number.

Part Number: a part number, for a configured part, is an ID that defines an equivalence class. Two objects that have the same part number are fully interchangeable. A part number is absolutely mandatory for each purchased or manufactured part that passes through a stock.

You cannot discuss the MBOM without knowing what a Manufacturing Process Plan (MPP) is:

A Manufacturing Process Plan (MPP) is a detailed document or roadmap that outlines the steps, sequences, methods, tools, equipment, and standards required to transform raw materials and components into new manufactured parts.

Here’s what the Manufacturing Process Plan typically includes:

  1. Sequence of Operations: The MPP lays out the precise sequence of steps that must be followed in the production process, from the preparation and treatment of raw materials to assembly, finishing, and inspection.
  2. Workstations: The workstations and machines on which manufacturing operations are performed.
  3. Tools and Equipment: The plan specifies the tools, machinery, and equipment necessary for each stage of production, including their setup and calibration.
  4. Materials and Components: The MPP lists the raw materials, semi-finished parts, and other components required.
  5. Quality Standards and Inspection: Quality control measures, acceptance criteria, and inspection techniques are outlined to ensure that the finished product meets the required standards and specifications.
  6. Labor and Skills Requirements: The plan may describe the labor requirements, including the necessary skills, qualifications, and training needed for different stages of the process.
  7. Time and Cost Estimates: Many MPPs also include estimates of the time and cost associated with each step, aiding in scheduling, budgeting, and overall project management.
  8. Safety and Environmental Compliance: The MPP may also contain information regarding safety protocols, waste management, and compliance with environmental regulations.

Product Design versus Manufacturing Process Design

In the world of modern manufacturing, the development of a product is no longer an isolated endeavor. The conception of a product and its manufacturing process is a parallel undertaking, intricately woven together to ensure efficiency, cost-effectiveness, and innovation. Here’s a closer look at how this parallel design process unfolds:

1. Reuse of Existing Parts or Subassemblies

The initial stages of product design often involve an assessment of existing parts or subassemblies that might be reused or adapted. This not only saves time and resources but also leverages proven components to enhance reliability. Reutilizing existing parts requires a clear understanding of inventory, previous designs, and how these components can integrate with new products.

2. Selection of Component and Subassembly Sourcing Methods

Choosing the right sourcing methods for components and subassemblies is a simultaneous consideration with product design. This involves critical decisions like:

  • Purchasing: Sourcing ready-made components that meet the required specifications.
  • Subcontracting: Collaborating with specialized manufacturers to produce certain parts or assemblies.
  • Internal Manufacturing: Producing the components in-house, leveraging existing capabilities, and controlling quality.

This choice is closely tied to factors like cost, quality, lead time, and strategic alignment with the overall product objectives.

3. Defining the Manufacturing Process with Impact on Part Design

The manufacturing process’s conception is not an afterthought but an integral part of the overall product design. The process selected can significantly impact the design of the parts, influencing factors such as:

  • Material Selection: Choice of materials that align with manufacturing capabilities and product requirements.
  • Tolerance Levels: Defining the acceptable variations in dimensions that can be achieved within the chosen manufacturing methods.
  • Cost Constraints: Aligning the part design with cost-effective manufacturing techniques without compromising quality.
  • Sustainability Considerations: Incorporating sustainable practices in both design and manufacturing.

By considering the manufacturing process early in the design phase, it is possible to create products that are not only innovative but also manufacturable, cost-effective, and aligned with market needs.

Conclusion about MPP: A Symbiotic Relationship

The parallelism between product design and the conception of its manufacturing process reflects a symbiotic relationship where each aspect informs and shapes the other. This synergy fosters innovation, reduces time-to-market, and ensures that the final product is not only a realization of creative vision but also a practical and marketable commodity.

In essence, modern manufacturing demands a holistic approach, where design is not confined to the drawing board but extends into the very heart of production, encompassing aspects such as part reuse, sourcing strategies, and process considerations that resonate with the product’s purpose, market positioning, and value proposition.

In such a case, the MBOM is not, strictly speaking, created from an EBOM. The MPP is designed in parallel with the EBOM, with a strong accountability check to verify that the two structures are aligned (i.e. all the design components have an equivalent in the MPP structure). Then, the MBOM is a filtering of the MPP.

Understanding MBOM in PLM: A Connection Between Design and Production

The Manufacturing Bill of Materials (MBOM) is an indispensable element within Product Lifecycle Management (PLM), serving as a bridge between product design and actual manufacturing processes. Its role in defining the flow of materials and components used in production is crucial. Here’s an in-depth look at the role of MBOM within PLM:

1. MBOM as a Filtered View of Manufacturing Process Plan (MPP)

The definition of the MBOM is quite simple: for a product, sub-assembly, or single part, it involves filtering the process plan, detailing the components and raw materials taken from stocks and consumed in the manufacturing process.
Thus, intrinsically, the MBOM is second-level information relating to the process plan. It translates procedural guidelines into concrete requirements for materials and components.

Thus, the answer to the question “What does my MBOM look like for a given product?” comes down to the following questions:

  • How do I manufacture my product?
  • Which raw materials and components do I have to source?
  • Which intermediate components pass through inventory/stock during the overall manufacturing process?

Of course, often, the MBOM structure is quite close to the EBOM structure.
But do not forget that they can be quite different. Think about products delivered in kits.
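The "MBOM as a filtered MPP" idea can be sketched as follows: given a process plan whose operations consume both stocked items and in-line WIP, the MBOM keeps only the stocked inputs (all identifiers and field names are invented):

```python
# Hypothetical MPP: ordered operations, each consuming inputs.
# "from_stock" marks inputs taken from inventory (they carry a part number);
# WIP produced by a previous operation is consumed directly, never stocked.
mpp = [
    {"op": 10, "name": "Cut tube",   "inputs": [{"id": "RAW-STEEL",  "from_stock": True}]},
    {"op": 20, "name": "Weld frame", "inputs": [{"id": "WIP-TUBE",   "from_stock": False},
                                                {"id": "BRACKET-01", "from_stock": True}]},
    {"op": 30, "name": "Assemble",   "inputs": [{"id": "WIP-FRAME",  "from_stock": False},
                                                {"id": "WHEEL-02",   "from_stock": True}]},
]

def mbom_from_mpp(mpp: list[dict]) -> list[str]:
    """The MBOM is the MPP filtered down to stock-consumed items."""
    return [i["id"] for op in mpp for i in op["inputs"] if i["from_stock"]]

print(mbom_from_mpp(mpp))  # ['RAW-STEEL', 'BRACKET-01', 'WHEEL-02']
```

Note how the WIP tube and frame disappear from the MBOM while their raw material (the steel) remains, exactly as described in the differences listed below.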

2. Integration between PLM and ERP Systems

As soon as the manufacturing process is complex enough, it is worth creating the process plan and the MBOM within the PLM system, using all the Product Data Management and Manufacturing Process Simulation capabilities.
Once created in the PLM, the MBOM is leveraged in the Enterprise Resource Planning (ERP) system, where it guides procurement, production scheduling, and inventory management.

In Summary: MBOM Describes Inventory Flows

The MBOM serves as a dynamic roadmap that describes the incoming and outgoing flows of stocks, components, and materials within the production process. Of course, it must be aligned with the EBOM (i.e. all the components of the EBOM must be ‘consumed’ in the MBOM), taking into account several main differences. Just a few examples:

  • Components of purchased or subcontracted subassemblies do not appear in the MBOM.
  • Raw materials appear in the MBOM of manufactured parts.
  • Components that are produced during the Manufacturing Process as WIP (i.e. that do not pass through a stock during the manufacturing process) do not appear in the MBOM, but their raw materials do!
  • Some semi-finished parts can appear in the MBOM (for example, if some manufacturing operations are subcontracted).
  • You can have some specific manufacturing objects, like kits.
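The accountability check between EBOM and MBOM can be sketched as follows: an EBOM component is covered either directly by the MBOM, or through the raw materials of the WIP that produces it (a deliberately simplified rule; all identifiers are invented):

```python
def unconsumed_ebom_items(ebom: set[str], mbom: set[str],
                          wip_outputs: dict[str, set[str]]) -> set[str]:
    """Return EBOM components with no equivalent in the MBOM.
    A component is 'covered' if it is in the MBOM directly, or if it is
    produced in-line as WIP whose raw materials are all in the MBOM."""
    covered = set(mbom)
    for wip, raw_materials in wip_outputs.items():
        if raw_materials <= mbom:   # all its raw materials are sourced
            covered.add(wip)
    return ebom - covered

ebom = {"FRAME", "WHEEL-02"}
mbom = {"RAW-STEEL", "WHEEL-02"}
wip = {"FRAME": {"RAW-STEEL"}}      # the frame is made in-line from steel
print(unconsumed_ebom_items(ebom, mbom, wip))  # set() -> structures aligned
```

An empty result means every design component is accounted for; any remaining item flags a gap between the engineering and manufacturing views.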

To conclude

Often, a product is designed, and defined, in the way it will be manufactured. That’s why the EBOM and MBOM are often close. But that shouldn’t make us forget that EBOMs and MBOMs serve two different purposes. The EBOM defines the product; the MBOM defines the supplies needed to build the product, and the flows between stocks and production lines.

And, for complex products, this does not solve one of the main questions of PLM:

  • How do we guarantee the alignment between Design and Manufacturing during the whole lifecycle of the product ?

This alignment, which is the foundation of the digital thread, relies on a strong change management process, but also on technical functionalities. To be developed in further posts!

My Beliefs in Terms of PLM Implementation Methodology

Implementing a Product Lifecycle Management (PLM) solution involves a multi-disciplinary approach that requires various key roles, a strategic methodology, and meticulous execution to ensure that the solution effectively supports the organization’s objectives.

Here is an outline of what I recommend:

1. Key Roles: Solution Architect, Business Consultants, Functional Consultants, and Technical Engineers

The overall success of a PLM implementation largely relies on the synergy of various key roles, which include:

  • Solution Architects: They are responsible for guaranteeing the global coherency of the implemented solution. They provide a holistic view, making sure that all parts of the implementation align with the organization’s overall strategy and objectives. The Solution Architect translates business requirements into technology requirements (data model, configuration rules, and processes) and defines the overall PLM architecture.
  • Business Consultants: They bridge the gap between the organization’s operational needs and the technical solution. They understand the business operations in depth and help translate these requirements into actionable implementation strategies.
  • Functional Consultants: They are responsible for configuring the PLM software to meet the organization’s needs as outlined by the business consultants. They understand the software’s capabilities and limitations, and work towards creating an optimal setup.
  • Technical Engineers: They execute the implementation plan, handling software installation, integration, and support.

2. Hybrid Methodology: V Cycle and Agile

My belief is a hybrid of two proven approaches – the V Cycle and Agile.

The initial phase of the project utilizes the V Cycle method to establish the core model and the common foundations of the solution. This model includes defining and understanding the business requirements, designing the system architecture, and developing test strategies. This methodology helps ensure that we develop a system that is not only technically sound but also addresses all the operational needs of the business.
The main objective of this core model is to ensure the overall consistency of the solution, to guarantee overall digital continuity (i.e. digital thread).
During this phase, the use of a System Engineering approach can be particularly effective.

Once the core model is established, we switch to an Agile approach for implementing user features. This iterative method allows for continuous delivery of small, incremental changes based on user feedback and testing. Agile promotes flexibility, encourages collaboration, and helps manage changing priorities effectively, leading to an overall better fit solution for the organization.
During this phase, the Solution Architect continues to oversee the process, and update the core model, ensuring the overall coherence of the solution.

3. Minimizing Developments, Maximizing Parameterization

A fundamental principle of today’s PLM implementation methodologies is to limit, or better, avoid custom developments and instead maximize the use of the parameterization capabilities of the implemented software. This approach not only reduces the time and cost of implementation but also makes the system easier to maintain and upgrade.

Custom developments can introduce complexities into the system, making it difficult to upgrade or adapt to changing business needs. On the other hand, parameterization allows for flexibility and scalability, enabling the system to evolve with the organization. Hence, we strive to utilize the software’s existing capabilities to the fullest extent and only accept custom developments when absolutely necessary.

By following this structured approach, we can ensure a smooth, effective PLM implementation that is tailored to customer organization’s needs and easy to maintain in the long run.

4. Working in a Closed Loop

Each business or functional requirement should be benchmarked with the capabilities of the software. This process allows us to quickly identify and close any gaps, preferably without the need for custom development.
Whenever possible, gaps should be closed through parameterization or reformulation of the requirements. It’s important to note that requirements are often expressed as solutions and can often be reformulated in a more effective manner. This nevertheless requires a perfect understanding of the customer’s processes in order to make credible counter-proposals.

By adhering to this approach, such a PLM implementation methodology is designed to deliver an effective, sustainable solution that can evolve with the customer’s organization and deliver maximum value from their investment.

PLM, LinkedIn and complexity

I am stunned by the average level of posts on PLM on LinkedIn.
I’m sorry, but in 2023:

  • Part Number management (significant, non-significant) should no longer be a problem.
  • Revision management should no longer be a question.
  • Configuration management theory should no longer be a question.
  • Notions of EBOMs, MBOMs … should no longer be a question.

So why are there still problems on these topics?

I see 3 reasons:

  • The weight of habits, the conservatism of end users
  • PLM software capabilities
  • The complexity induced by these topics due to the complexity of the managed products

Conservatism of end users

I have often been stunned by the conservatism of the key users during PLM solution designs, especially when it is a second- or third-generation PLM project. People are aware that things must change, that alignment with standards means the disappearance of secondary functions, but they cannot get used to these ideas. To sum up: everything is allowed, nothing is possible.
The integrator is pushed to implement expensive and useless functions, or in the worst cases, models that harm the general coherency of the solution.

Thus, customers find themselves requiring meaningful identifiers (with heavy developments to generate and control these identifiers), more or less wobbly revision management, and the capability to modify the class of an object afterwards. And, of course, non-compliance with these requirements is a show-stopper, even if 95% of the companies do not need them…

PLM software capabilities

Whatever one may think, there are, in 2023, real differences, and real differentiators, between the main PLM solutions, especially on two topics, essentially related to product structure management:

  1. Occurrences, revision and effectivity management
  2. Continuity between the different structures: EBOM, MBOM, Manufacturing Process (routings).

The extended notion of occurrence (relative versus absolute occurrences), strong revision management rules, and a single structure management engine are key topics.

If these functions are not correctly implemented, the risks of more or less generic workarounds, or heavy specific developments, impacting the core of the solution are likely to prevail, with all the impacts on the maintenance and scalability of the solution.

Complexity

There is a refusal, which turns into a denial, of complexity among users and customer PLM managers.

If you mix (and you have to mix!) the digital thread between the different product views (EBOM, MBOM, SBOM…) with change management – implying revision management and traceability – you quickly get something that seems very complex, even unmanageable by end users.

Sorry, guys, but even if the functions are well implemented in a PLM solution, configuration management (taken in its broadest sense) is, and will remain a complex topic. Insuring traceability between engineering and manufacturing views of a product is a complex task. To manage it, the PLM solution must be efficient, the PLM implementation must be rigorous, and users must be trained.