PLM Architecture: A Tentative Overview

1. Introduction

Product Lifecycle Management (PLM) is a complex and multifaceted approach to managing the entire lifecycle of a product from inception, through engineering design and manufacture, to service and disposal. This document provides a detailed overview of PLM architecture, taking into account its business context, key concepts, and the inherent complexity of breakdown structures within PLM systems.

The goal of this overview is to offer a comprehensive understanding of PLM systems and their implementation in modern enterprise environments. By exploring the intricacies of PLM architecture, we aim to equip stakeholders with the knowledge necessary to effectively design, implement, and manage PLM systems that can handle the complexities of modern product development and lifecycle management.

2. Complexity of Breakdown Structures in PLM

Breakdown structures are the cornerstone of PLM systems and the primary source of their complexity. These hierarchical representations of product information serve as the backbone for organizing and managing data throughout a product’s lifecycle.

2.1 Product Breakdown Structure (PBS)

The Product Breakdown Structure (PBS), generally called the BOM (Bill of Materials), is a hierarchical decomposition of a product into its constituent parts, assemblies, and sub-assemblies. It serves as a blueprint for understanding the product’s composition and the relationships between its components. The PBS:

  • Provides a clear view of product structure and composition
  • Facilitates component management and traceability
  • Takes different forms depending on the lifecycle process, with design, manufacturing, and maintenance structures among the most common

2.2 Related Breakdown Structures

While the PBS is central to PLM, several other breakdown structures contribute to the overall complexity:

  • Requirement Breakdown Structure: Organizes and manages product requirements
  • Functional Breakdown Structure: Represents the functional architecture of the product
  • Plant Breakdown Structure: Describes the layout and organization of manufacturing facilities
  • Manufacturing Process Breakdown Structure: Outlines the steps and resources needed for product manufacturing
  • Maintenance Breakdown Structure: Defines the maintenance activities and schedules for the product

2.3 Revision Management

Revision management tracks changes to components, assemblies, or entire products over time. It involves:

  • Maintaining relationships between revised components
  • Ensuring compatibility between different revision levels (the notion of Form, Fit, Function, or FFF; see the sketch after this list)
  • Tracking the history of changes
  • Supporting impact analysis of proposed changes
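To make the revision rules concrete, here is a minimal sketch in Python of the classic interchangeability decision implied by FFF: a change that breaks Form, Fit, or Function calls for a new part number, while a compatible change only needs a new revision of the same number. The `Change` type and its fields are illustrative assumptions, not any particular PLM system’s data model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Change:
    """A proposed change, classified against Form, Fit, and Function."""
    description: str
    affects_form: bool
    affects_fit: bool
    affects_function: bool

def requires_new_part_number(change: Change) -> bool:
    """Interchangeability rule: breaking Form, Fit, or Function makes the
    part non-interchangeable, so it must be renumbered; otherwise a new
    revision of the same part number suffices."""
    return change.affects_form or change.affects_fit or change.affects_function

# A purely cosmetic change keeps FFF intact: revise, don't renumber.
cosmetic = Change("new paint color", False, False, False)
assert not requires_new_part_number(cosmetic)
```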

2.4 Effectivities

Effectivity defines when a specific configuration or component is valid within a product structure. The main types, illustrated by the sketch after this list, are:

  • Date effectivity: Based on calendar dates
  • Serial number effectivity: Based on product serial numbers
  • Lot effectivity: Based on manufacturing lots or batches
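A minimal sketch of these three effectivity types, assuming a simplified model in which each type answers the same question: is this configuration valid for a given date, serial number, or lot? Class and field names are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class DateEffectivity:
    start: date
    end: date | None = None          # None means open-ended

    def is_effective(self, on: date) -> bool:
        return self.start <= on and (self.end is None or on <= self.end)

@dataclass(frozen=True)
class SerialEffectivity:
    first_serial: int
    last_serial: int | None = None   # None means open-ended

    def is_effective(self, serial: int) -> bool:
        return self.first_serial <= serial and (
            self.last_serial is None or serial <= self.last_serial)

@dataclass(frozen=True)
class LotEffectivity:
    lots: frozenset[str]             # the lots/batches this applies to

    def is_effective(self, lot: str) -> bool:
        return lot in self.lots
```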

2.5 Options and Variants

  • Options: Selectable features or components that can be added to a base product
  • Variants: Distinct versions of a product with pre-defined sets of options (see the sketch below)
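A small sketch of the distinction, under the assumption that a variant is simply a base product with a locked-in set of option codes; all names are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Option:
    """A selectable feature that can be added to a base product."""
    code: str        # e.g. "SUNROOF"
    description: str

@dataclass(frozen=True)
class Variant:
    """A distinct, pre-defined product version: base plus fixed options."""
    name: str
    base_product: str
    option_codes: frozenset[str]

# Hypothetical catalog entry: the "Touring" variant bundles two options.
touring = Variant("Touring", "MODEL-X", frozenset({"SUNROOF", "TOWBAR"}))
```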

2.6 Interrelationships and Challenges

The complexity of PLM systems stems from the intricate relationships between these structures:

  • Cross-structure dependencies: Managing relationships between different breakdown structures
  • Configuration management challenges: Ensuring consistency across complex product structures
  • Data management challenges: Handling large volumes of product data across multiple domains

Breakdown structures are the key concept of PLM and the main source of complexity in these systems. They provide a structured way to represent and manage the various aspects of a product throughout its lifecycle. However, the multitude of interconnected breakdown structures, their hierarchical nature, and the need to manage changes and variations across these structures create significant complexity.

This complexity is further amplified by the need to maintain consistency across different domains, manage revisions and effectivities, and handle large volumes of data. Understanding and effectively managing these breakdown structures is crucial for successful PLM implementation and operation.

3. Key Concepts in PLM Architecture

To manage the complexity inherent in breakdown structures, PLM architecture relies on several key concepts:

3.1 Occurrence Effectivity

Occurrence Effectivity defines when and where a specific configuration or part is valid within a product structure. It enables management of product changes over time, allowing for:

  • Tracking of component usage across different product versions
  • Management of product variants and configurations
  • Efficient handling of product evolution and updates

3.2 Baseline

A Baseline is a snapshot of a product’s configuration at a specific point in time (or for a specific end-product serial number). It serves as a stable reference point for development and analysis, providing:

  • A consistent view of the product configuration for all stakeholders
  • A foundation for change management and version control
  • A mechanism for tracking product evolution over time (see the sketch after this list, which also illustrates occurrence effectivity)
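The sketch below ties Occurrence Effectivity (3.1) to Baselines: each occurrence in the product structure carries its own validity, and a baseline is the frozen subset of occurrences effective at a chosen date. It is a deliberately simplified model with hypothetical names, not any vendor’s API.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Occurrence:
    """One usage of a component within a parent assembly."""
    parent: str
    component: str
    effective_from: date
    effective_to: date | None = None  # None means open-ended

    def is_effective(self, on: date) -> bool:
        return self.effective_from <= on and (
            self.effective_to is None or on <= self.effective_to)

def create_baseline(structure: list[Occurrence], on: date) -> frozenset[Occurrence]:
    """Snapshot the structure: keep only the occurrences effective on the
    given date. The frozen result is the stable reference point that all
    stakeholders share."""
    return frozenset(occ for occ in structure if occ.is_effective(on))
```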

3.3 Configuration Item (CI)

A Configuration Item is a fundamental unit of a product that is managed separately in the configuration management process. CIs allow for modular management of complex products by:

  • Enabling independent management of product components
  • Facilitating change control and version management at a granular level
  • Supporting traceability and impact analysis of changes

4. PLM Architecture Overview

To address the complexity of breakdown structures and implement the key concepts effectively, PLM architecture typically consists of the following core components:

4.1 Enterprise Data Manager (EntDM)

The EntDM serves as the central hub of the PLM architecture. Its responsibilities include:

  • Managing various Breakdown Structures (BS) across the product lifecycle
  • Handling transverse change and project management
  • Managing Configuration Item (CI) applicability (applicability covers both effectivity and option/variant applicability rules)
  • Creating and distributing Baselines
  • Integrating Configuration Items from various domains

4.2 Team Data Managers (TDMs)

TDMs are domain-specific components responsible for:

  • Managing data and processes within specific business domains
  • Receiving (often partial) Baselines from the EntDM
  • Providing Configuration Items back to the EntDM
  • Managing design iterations within their domain
  • Embedding and managing domain-specific authoring tools

4.3 Authoring Tools

These are specialized software tools embedded within each TDM, including:

  • MCAD (Mechanical Computer-Aided Design)
  • ECAD (Electronic Computer-Aided Design)
  • FEM (Finite Element Method) tools
  • Plant Design software
  • Simulation tools

4.4 Workflow and Data Flow

  1. The EntDM creates a Baseline and distributes it to relevant TDMs.
  2. TDMs work on their domain-specific aspects using the provided Baseline.
  3. TDMs submit their Configuration Items back to the EntDM.
  4. The EntDM integrates these CIs into the overall product structure, defines their effectivities, and creates new Baselines (a sketch of this round trip follows).
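A minimal sketch of this round trip, with hypothetical class and method names standing in for whatever interfaces a real EntDM and TDM would expose:

```python
class TDM:
    """Domain-specific Team Data Manager (e.g. mechanical design)."""
    def __init__(self, domain: str):
        self.domain = domain
        self.baseline: dict | None = None

    def receive_baseline(self, baseline: dict) -> None:
        # Step 2: work on domain-specific aspects against this snapshot.
        self.baseline = baseline

    def build_ci(self) -> dict:
        # Step 3: package domain results as a Configuration Item.
        return {"domain": self.domain, "content": "design results"}

class EntDM:
    """Central hub: owns breakdown structures, applicability, and baselines."""
    def __init__(self):
        self.structure: list[dict] = []

    def create_and_distribute(self, tdms: list[TDM]) -> None:
        # Step 1: create a resolved (static) baseline and push it out.
        baseline = {"items": list(self.structure)}
        for tdm in tdms:
            tdm.receive_baseline(baseline)

    def integrate(self, ci: dict, effectivity: str) -> None:
        # Step 4: merge the CI into the structure and define its effectivity.
        self.structure.append({"ci": ci, "effectivity": effectivity})

# One full cycle of the workflow described above:
entdm, mech = EntDM(), TDM("mechanical")
entdm.create_and_distribute([mech])
entdm.integrate(mech.build_ci(), effectivity="from serial 0101")
```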

5. Rationale for PLM Architecture

The architecture described above is designed to address the complexity of PLM systems while achieving several key objectives:

5.1 Key Objectives

  1. Simplify data exchange
  2. Reduce Configuration Management complexity
  3. Improve efficiency and reduce errors

5.2 Approach

  • Exchange only resolved (static) data structures between EntDM and TDMs
  • Centralize Configuration Management in the EntDM

5.3 Benefits

  1. Simplified Data Exchange
  2. Centralized Configuration Management
  3. Clear Separation of Concerns
  4. Improved Efficiency
  5. Enhanced Data Integrity
  6. Scalability

6. Strategies for Managing Complexity

Given the inherent complexity of PLM systems, several strategies can be employed to manage and mitigate this complexity:

6.1 Modular Design Approaches

  • Implement component-based architecture
  • Develop standardized data models and interfaces
  • Create abstraction layers to hide complexity

6.2 Advanced PLM Systems

  • Leverage intelligent data management techniques
  • Utilize cloud-based PLM solutions for scalability and accessibility
  • Implement robust integration platforms

6.3 Clear Governance Policies

  • Establish data governance frameworks
  • Define clear process governance
  • Ensure compliance management

6.4 Visualization Tools

  • Implement 3D visualization for product data
  • Utilize data visualization techniques for complex relationships
  • Employ process visualization tools for workflow management

6.5 Continuous Improvement and Adaptation

  • Conduct regular system audits
  • Adopt agile implementation methodologies
  • Implement knowledge management systems

7. Impact on Service-Oriented Architecture

The PLM architecture we’ve discussed has significant implications when implemented in a service-oriented architecture (SOA). This section explores these impacts, particularly focusing on the Enterprise Data Manager (EntDM) and Team Data Managers (TDMs).

7.1 Enterprise Data Manager (EntDM) in SOA

While the EntDM can be implemented as one or several applications delivering services, there are important considerations:

  • Shared Business Logic: Even when distributed across multiple applications, these services share the same business logic, typically implemented as a shared library.
  • Coherent Service Set: Despite potentially not being monolithic, the EntDM represents a coherent set of services. This coherence is crucial for maintaining consistency in data management and configuration control across the product lifecycle.
  • Single Editor Source: Typically, these services originate from a single software editor or vendor. This ensures consistency in implementation and updates.
  • Potential for Collaboration: The editor may choose to share the underlying shared library with partners or even competitors. This could foster ecosystem development but requires careful management of intellectual property and standardization.

7.2 Team Data Managers (TDMs) in SOA

Similar principles apply to the implementation of TDMs in a service-oriented architecture:

  • Domain-Specific Services: Each TDM can be composed of multiple services, but these are typically specific to a particular domain (e.g., Mechanical Design, Electrical Design).
  • Shared Domain Logic: Within each TDM, services share domain-specific business logic, often implemented as a shared library for that particular domain.
  • Vendor Specialization: Different TDMs might come from different vendors specializing in specific domains, but each TDM set is typically from a single vendor to ensure coherence within the domain.

7.3 Implications for PLM Architecture

This approach to implementing EntDM and TDMs in a service-oriented architecture has several implications:

  1. Scalability: Services can be scaled independently based on demand, potentially improving system performance and resource utilization.
  2. Modularity: The use of shared libraries within EntDM and TDMs promotes modularity, potentially making it easier to update or extend functionality.
  3. Consistency: Shared business logic ensures consistency in data handling and business rules application across services.
  4. Vendor Lock-in: The reliance on vendor-specific shared libraries can lead to a degree of vendor lock-in, particularly for the EntDM.
  5. Integration Challenges: While each set of services (EntDM or specific TDMs) is internally coherent, integration between different TDMs or between TDMs and the EntDM may present challenges, especially if they come from different vendors.
  6. Standardization Opportunities: If vendors are willing to share or standardize their shared libraries, it could lead to greater interoperability in the PLM ecosystem.
  7. Performance Considerations: While SOA offers benefits in terms of scalability and modularity, it may introduce performance overhead due to service communication. Careful design is needed to balance distribution and performance.

This service-oriented approach to PLM architecture allows for a flexible and scalable system while maintaining the coherence necessary for effective product lifecycle management. However, it also underscores the importance of careful vendor selection, system integration, and potential standardization efforts in the PLM industry.

8. Conclusion

PLM architecture is a complex domain that requires careful consideration of various factors including data management, process integration, and technological implementation. The breakdown structures that form the backbone of PLM systems are both the key to effectively managing product lifecycle and the primary source of complexity.

By understanding the intricacies of breakdown structures, implementing key PLM concepts, and adopting a well-designed architecture, organizations can create PLM systems that effectively manage this complexity. The move towards service-oriented architectures presents both opportunities and challenges, emphasizing the need for thoughtful design and implementation approaches.

Effective PLM implementation requires a balance between centralized control and domain-specific flexibility, as well as a commitment to continuous improvement and adaptation. By leveraging the strategies outlined for managing complexity, organizations can implement robust PLM systems that support their product lifecycle management needs, driving innovation, efficiency, and competitiveness in today’s rapidly evolving product development landscape.

PLM, Graph Theory, AI, Change Management, and Digital Thread

  1. Managing Information in PLM: The size and complexity of products, coupled with the scope of PLM (engineering, manufacturing, requirements, MBSE, quality management, etc.), mean that a PLM system must handle enormous quantities of information. This includes both business objects and the relationships between them, varying based on the granularity of the information.
  2. Graph Theory Application: Graph theory should be considered for this management but must be carefully studied and adapted. The PLM business objects are complex, polymorphic, and subject to revisions, while the relationships are strongly typed, carry applicability information (like dates or batch numbers), and require interpretation for optimal navigation among objects.
  3. Digital Thread Management and Impact Analysis: Managing digital continuity and analyzing the impact of modifications necessitates a reasoned navigation through the graph of relationships. For graphs with tens of thousands of objects and hundreds of thousands of relationships, this task is nearly impossible for humans without powerful filters.
    • If filters are static, we revert to “classical” views (requirements, MBSE, engineering, manufacturing, quality), which diminishes the benefits of graph theory.
    • Dynamic filters, defined according to the context or type of change, enhance search efficiency and navigation. In such cases, AI should be strongly considered, as it can provide a holistic and comprehensive view of digital continuity and the impacts of modifications. A sketch of such context-dependent filtering follows.
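To illustrate, here is a minimal sketch of impact analysis over such a typed, applicability-carrying relationship graph. The dynamic filter is passed in as a function of the current context; an AI component could, in this scheme, be what generates or tunes that filter. Relation names and the applicability encoding are invented for the example.

```python
from collections import deque
from typing import Callable

# Each object id maps to its outgoing edges: (relation_type, applicability, target).
Graph = dict[str, list[tuple[str, str, str]]]

def impacted(graph: Graph, start: str,
             follow: Callable[[str, str], bool]) -> set[str]:
    """Breadth-first impact analysis. `follow` is the dynamic filter: given
    a relation type and its applicability, it decides whether the change
    propagates along that edge in the current context."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for rel_type, applicability, target in graph.get(node, ()):
            if follow(rel_type, applicability) and target not in seen:
                seen.add(target)
                queue.append(target)
    return seen - {start}

# Context-specific filter: follow only manufacturing links valid for lot B7.
graph = {"P1": [("feeds_mbom", "lot:B7", "OP-40"),
                ("satisfies", "all", "REQ-12")]}
hits = impacted(graph, "P1",
                lambda rel, app: rel == "feeds_mbom" and app == "lot:B7")
assert hits == {"OP-40"}
```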

Conclusion: This is where artificial intelligence (AI) plays a crucial role. AI can provide a holistic and exhaustive perspective on digital continuity and the implications of modifications, something indispensable in complex PLM environments. The integration of AI in PLM, particularly in conjunction with graph theory, transforms what would otherwise be an overwhelming influx of data into a structured, navigable, and insightful resource. Graph theory alone, albeit useful, achieves its full potential only when complemented with advanced AI tools, making it an indispensable asset in modern PLM strategies and impact analyses.

The States (and Stages) of a Part in PLM: Navigating the Complex Landscape

In the realm of product development, the journey of a single part is a multifaceted expedition, marked by distinct stages and considerations. Let’s embark on a voyage through the various states of a part:

  1. Specifications:
    • It all begins with an idea or a need. The part is identified and defined by a set of specifications, requirements, and sometimes intricate schematics. This initial phase sets the groundwork for what the part needs to achieve and how it should fit into the overall product.
  2. Geometric Part:
    • As the concept takes shape, the part enters the realm of Computer-Aided Design (CAD). Here, its geometric form is meticulously crafted, defining its shape, size, and functionality. However, aspects like material and treatments remain open for further refinement.
      This is the stage of the product’s CAD BOM.
  3. Physical Part FFF (Form, Fit, Function):
    • In this phase, the part transforms into a comprehensive entity where all its essential characteristics, excluding aesthetics, are precisely defined. It’s about ensuring that the part works seamlessly, fits perfectly, and serves its intended function.
      Generally, the initial EBOM is built at this stage of the part’s lifecycle.
  4. Physical Part FFFA (Form, Fit, Function, Aesthetic):
    • The final transformation occurs when aesthetics come into play. The part evolves into its complete form, with all characteristics, including color, texture, and appearance, meticulously defined. It becomes not just functional but also visually aligned with design requirements.
      This part is managed either in the enriched EBOM (often with Options/Variants) or in the product configurator.
  5. Manufacturer Reference:
    • For commercial parts or those sourced from various suppliers, a critical step is assigning a manufacturer reference. This reference is the exact code corresponding to the fully defined FFFA part, complete with specifications.
  6. Manufacturer Reference in Storage (SAP’s “MATERIAL”):
    • In the domain of inventory and supply chain management, the manufacturer reference becomes crucial once more. It corresponds to the part’s identity in storage, a bridge between digital and physical realms, often managed within an ERP system like SAP. It is typically the part that appears in the MRP BOM. (A sketch of these successive states follows the list.)
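A compact sketch of these successive states, assuming a simple linear progression; the enum names are hypothetical, all the more so since, as noted below, the vocabulary itself is not settled.

```python
from enum import IntEnum

class PartState(IntEnum):
    """Successive states of a part; the ordering encodes the normal
    forward progression described above."""
    SPECIFIED = 1            # specifications, requirements, schematics
    GEOMETRIC = 2            # CAD geometry; material/treatments still open
    PHYSICAL_FFF = 3         # form, fit, function fully defined
    PHYSICAL_FFFA = 4        # aesthetics (color, finish) added
    MFR_REFERENCE = 5        # exact supplier code for the FFFA part
    MFR_REF_IN_STORAGE = 6   # ERP material (e.g. SAP); appears in the MRP BOM

def advance(state: PartState) -> PartState:
    """Move to the next state; raises ValueError beyond the final one."""
    return PartState(state + 1)

assert advance(PartState.GEOMETRIC) is PartState.PHYSICAL_FFF
```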

These states are integral to efficient product management and supply chain coordination. However, it’s worth noting two important observations:

a) Vocabulary Gaps: Precise terminology to describe these states can sometimes be elusive, highlighting the complexity of managing parts.

b) Additional States: There are more states to explore, such as parametric parts and part families, often well-managed in CAD systems but posing intricate challenges in PLM.

As we navigate this complex landscape of part states, we appreciate that it’s not just about managing components but orchestrating a symphony of data, processes, and collaboration across PLM, ERP, and CAD systems. Join the conversation and share your insights into the evolving world of part management.

#ProductDevelopment #SupplyChain #PLM #ERP #Innovation

The Dual Perspectives of PLM: FFF vs. FFFA

In the realm of consumer products, which constitute a significant portion of industrial production, aesthetics plays a crucial role in product design. The three components of aesthetics are:

  • The shape of the object.
  • Its materials and surface finishes (for non-functional visible surfaces).
  • Its color(s), along with their characteristics (exact hue, glossiness, etc.).

The shapes of an object, typically defined by surfaces, are part of the basic product definition (FFF: Form, Fit, Function). Altering these surfaces directly impacts the physical characteristics of the object. However, this is not the case with surface finishes (for non-functional surfaces) and colors. Modifying them does not compromise the functional qualities of the object. For instance, mounting a green door on a red car might make it less aesthetically pleasing, but the car will still run just as well and remain just as reliable!

Therefore, we can assert that the “product” and its components can be viewed from two different perspectives in the vast majority of cases:

  • The FFF perspective (Form, Fit, Function), independent of aesthetic characteristics, aims to create a perfectly functional product. This approach must be meticulous; for example, if a part is painted, the paint must be considered generically.
  • The FFFA perspective (Form, Fit, Function, Aesthetic), in which the product and its components are represented in their various colors and finishes (see the sketch below).
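A minimal sketch of the two perspectives, assuming one generic FFF definition from which several fully specified FFFA references derive; part numbers and fields are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FFFPart:
    """Functional definition: aesthetics kept generic (e.g. 'painted')."""
    number: str
    geometry: str
    finish_class: str   # generic, e.g. "painted"

@dataclass(frozen=True)
class FFFAPart:
    """A 'real' reference: the FFF part plus fully specified aesthetics."""
    number: str
    fff: FFFPart
    color: str          # exact hue
    finish: str         # gloss level, texture, ...

door = FFFPart("DOOR-100", "door_surfaces.step", "painted")
red_door = FFFAPart("DOOR-100-R", door, "RAL 3020", "high gloss")
green_door = FFFAPart("DOOR-100-G", door, "RAL 6018", "high gloss")
# Both doors mount and function identically: the FFF core is shared,
# only the aesthetic layer differs.
```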

This distinction is significant, as it is relatively poorly managed by PLMs to date. Generally, the entire product study is conducted at the “FFF” stage, and its transition to the “FFFA” stage typically requires significant work in creating “real” references and product configurations.
There is even a lack of vocabulary to differentiate the parts at the two stages.

Additionally, attention must be given to (very rare) cases where color can physically impact a component or product:

  • Changes in the characteristics of a plastic material due to the addition of a specific dye, as in the case of mass-dyed parts.
  • Physical changes due to color, such as increased absorption of solar rays causing greater heating when a part is black, for example.

In conclusion, recognizing and effectively managing these two perspectives, FFF and FFFA, is crucial for the holistic development and aesthetic customization of consumer products within the PLM framework.

A View of PLM Systems Architecture: Balancing Contradictions and Complexity

The evolving world of Product Lifecycle Management (PLM) is witnessing constant innovations. With products becoming increasingly complex, PLM systems are striving to streamline the product development process. As we delve into the architecture of PLM systems, we encounter both contradictions and multilayered complexities.

1) Contradictory Constraints:

  • A Holistic Approach: Modern PLM systems are shifting towards a holistic model, aiming to capture the entire essence of a product’s lifecycle from ideation to disposal.
  • Flexibility & Modularity: In parallel, there’s an emphasis on creating PLM systems that are both agile and modular. This modularity ensures they can swiftly adapt to changing requirements and incorporate new tools.
  • Deep Environmental Integration: PLM systems are increasingly becoming integrated into their surrounding environment. This entails a harmonious integration with authoring tools, diverse data sources, and interfaces with systems like ERP (Enterprise Resource Planning), MES (Manufacturing Execution Systems), and MRO (Maintenance, Repair, and Operations).

2) A 3- to 4-Tier Logical Architecture:

  • Individual Work: This foundational layer is dedicated to integrating authoring tools. The prime objective here is to ensure that each user can access a tailored work environment suitable for their tasks. Furthermore, users should be able to elevate the outcomes of their individual tasks to the team level, ensuring data compatibility and coherence with the overall management system.
  • TDM (Team Data Management): This layer focuses on the operational processes at the team or department level. It supervises the management of working environments, consolidates individual contributions, and structures data in a format fit for the enterprise repository.
  • EDM (Enterprise Data Management): Serving as the enterprise’s central data repository, the EDM layer encompasses a broad spectrum of views, from initial requirements, system engineering, to physical design, manufacturing and even recycling stages. It’s the heart of the PLM, centralizing data and ensuring its integrity.
    While feeding the TDM layer with all essential data and creating ad hoc working environments, it also consolidates feedback information, defines its applicability, and ensures a digital thread of continuity between different views. The goal here is to remain as “neutral” as possible, making it adaptable to different contexts.
  • Interface & Formatting Layer: This layer’s primary role is to fine-tune data and channel it efficiently to downstream applications like MES, ERP, and MRO.

The PLM system architecture must describe an environment of profound complexity. Each layer, while fulfilling a distinct function, can be subdivided into several interactive modules. These modules, in unison, manage various data streams, emphasizing the complex yet harmonious nature of PLM systems. Essentially, the architecture should be balanced, adapting finely to these contradictions and complex design elements.

The Dilution of a Critical Role: The Overuse of the Term ‘Solution Architect’ in PLM

In the field of Product Lifecycle Management (PLM), the term “Solution Architect” has been increasingly and recklessly tossed around, leading to an inevitable dilution of the role’s specialized nature. What once signified a seasoned professional skilled in PLM’s intricate terrains has now often been misattributed to application engineers, senior configurators, and functional consultants. The intent of this article is to clarify what truly makes a Solution Architect in PLM and why it’s more than just another title.

Misuse of the Title: A Loss in Specialization

It’s become alarmingly frequent for roles like application engineers and senior configurators to receive the title of ‘Solution Architect in PLM.’ While these roles are significant in their areas, they lack the comprehensive skills needed to warrant the title. This lack of specificity is not only confusing but also detrimental to the quality of PLM projects.

Rich Functional Depth and Module Interaction

A genuine Solution Architect is expected to have a profound understanding of the rich functional layers and how different modules interact in the leading PLM solutions. The professional should be well-versed in everything from Computer-Aided Design (CAD) to Material Requirements Planning (MRP), and understand how these modules communicate to provide a cohesive system.

The 10-Year Experience Benchmark

A decade of experience is often touted as the benchmark for Solution Architects in PLM, with a minimum of 4 years focused on the specific solution to be implemented. This is not mere credential inflation. Here’s why:

  1. Deep Expertise: Four or more years on a specific PLM solution allows for a profound understanding of its capabilities and limitations.
  2. Risk Mitigation: A seasoned architect is less likely to underestimate challenges and can better predict potential roadblocks.
  3. Strategic Insight: Ten years in the field enables a long-term strategic viewpoint, indispensable for the success of PLM projects.
  4. Change Management: Experienced architects are better at helping organizations navigate the change a new PLM system will bring.
  5. Tech Stack Knowledge: A decade in PLM ensures a comprehensive understanding of the evolving tech stack that surrounds these systems.

The Importance of Multiple Experiences

Having diverse experiences across different PLM projects, industries, or functions can enrich a Solution Architect’s ability to see the bigger picture and to adapt strategies to unique circumstances.

Mastery of PLM Integration in the IT Ecosystem

True Solution Architects are experts in:

  • Authoring Tool Integration: This includes CAD for mechanical, electrical, and electronic components, as well as analysis and simulation tools.
  • Enterprise System Interface: They should be skilled in interfacing with Manufacturing Execution Systems (MES), Enterprise Resource Planning (ERP), Maintenance, Repair, and Overhaul (MRO), and Industrial Internet of Things (IIoT).

Holistic Vision and Client Constraints

An effective Solution Architect in PLM should possess a holistic view of the entire product lifecycle and be well-equipped to consider client-specific constraints, such as migration challenges and change management necessities.

Conclusion

The role of a Solution Architect in PLM is highly nuanced and demands a unique blend of expertise, experience, and vision. As the complexity of PLM solutions continues to escalate, it’s crucial that the integrity of this essential role be preserved. Organizations should be cautious in title attribution and ensure that those who bear it fully meet the specialized criteria that truly define a Solution Architect in the realm of PLM.

PLM and BOM Management: A Deeper Look into Configuration Items and Effectivity Management

Today, I’d like to discuss one significant but often misunderstood term in the realms of PLM, engineering, software development, and project management: Configuration Items (CIs) within Bills of Materials (BOMs). With a focus on their importance for effectivity management, let’s re-examine these critical elements that lay the foundation for optimized performance and control over organizational assets.

What is a Configuration Item (CI)? A Closer Look

A Configuration Item (CI) is an individual component within a larger system that can be managed independently. However, it’s essential to note that, in most cases, a CI is a supplied subassembly whose configuration is managed by a provider.

Example: Imagine an automobile manufacturing process. The navigation system in the car could be considered a CI. It is supplied by a third-party vendor, and its internal workings (say, the software, the GPS module, etc.) are configured and controlled by that vendor.

CI: A ‘Grey Box’ in Effectivity Management

Effectivity management focuses on ensuring that the right configurations are used at the right time to meet certain conditions or requirements. A CI serves as a “grey box” in this context, representing a break in effectivity management. The product sees the CI as a single unit and doesn’t care about its composition.

Why is this significant?

  1. Streamlined Complexity: When a CI is treated as a grey box, it allows the primary manufacturer to focus on the integration of the CI without getting entangled in its intricate details.
  2. Provider Expertise: The provider who manages the CI is often an expert in that specific subassembly, ensuring that it meets all performance and quality criteria.
  3. Simplified Compliance and Documentation: Since the configuration of the CI components is managed at the CI level and not the product level, it makes compliance with standards and regulations more straightforward.

Why a ‘Grey Box’ for Configuration Items?

The term “grey box” is often used in the context of Configuration Items (CIs) to describe a component whose internal workings are not entirely transparent to the end-user or primary manufacturer but are still somewhat understood or defined. Unlike a “black box,” where the internal components and activities are completely unknown, a “grey box” suggests that while the primary manufacturer may not manage or control the CI’s internals, some level of information is available.

  1. Provider’s Expertise: The vendor or provider who supplies the CI manages its internals. This management typically involves specialized knowledge that the primary manufacturer may not have, making it more effective for the CI to be a “grey box” from the manufacturer’s perspective.
  2. Simplified Integration: Treating the CI as a grey box allows the primary manufacturer to focus on how it fits and functions within the larger system, without getting bogged down by its internal complexity.
  3. Effectivity Management: Each CI has its own effectivity and Option/Variant rules managed by the provider. Therefore, the product sees the CI as a self-contained unit, making it easier to manage effectivity at the product level.
  4. Flexibility for Special Cases: The “grey box” approach allows for exceptions, such as spare parts management. Even though a CI is generally treated as a self-contained unit, there can be scenarios where one might need to manage its components separately. For example, if a sub-component of a CI needs to be replaced as a spare part, it can still be managed at that level without altering the rest of the CI. This would not be possible if the CI were a completely closed “black box.”

By treating CIs as “grey boxes,” manufacturers maintain a level of flexibility and control without delving too deep into complexities that can be more efficiently managed by the CI’s provider. It offers a balanced approach that can be advantageous in various manufacturing and project management scenarios.
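A small sketch of the grey-box idea: the CI presents itself to the product as a single unit with provider-managed effectivity, yet leaves a narrow path to its internals for cases like spare parts. All names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ConfigurationItem:
    """A 'grey box': the product sees one unit; internals stay provider-managed."""
    ci_id: str
    provider: str
    effectivity: str                  # applicability rules owned by the provider
    internals: dict = field(default_factory=dict)  # visible, but not managed here

    def as_product_node(self) -> tuple[str, str]:
        # Product-level effectivity management treats the CI as one unit.
        return (self.ci_id, self.effectivity)

    def spare_part(self, reference: str):
        # Grey (not black) box: a sub-component stays reachable for spares
        # without the manufacturer taking over the CI's configuration.
        return self.internals.get(reference)

nav = ConfigurationItem("NAV-SYS-2", "SupplierCo", "lots B7 through C2",
                        internals={"GPS-MOD-1": "GPS module"})
assert nav.as_product_node() == ("NAV-SYS-2", "lots B7 through C2")
assert nav.spare_part("GPS-MOD-1") == "GPS module"
```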

The Role of CIs in Large Assembly BOMs

Bills of Materials (BOMs) list all the components, materials, and sub-assemblies necessary to manufacture a product. For large assembly BOMs, CIs become particularly crucial for a few key reasons:

  1. Modularity: Treating CIs as grey boxes allows for more manageable modules within the BOM.
  2. Cost and Time Efficiency: Understanding CIs in this manner helps in better budget control and allows for parallel development activities.
  3. Risk Containment: Should a problem arise in a CI, it can be addressed without derailing the larger project, as the CI is a self-contained unit managed by the provider.

Understanding the nuances of Configuration Items and their role in effectivity management and large assembly BOMs can be a game-changer. It can significantly impact how efficiently a project progresses and how effectively a product performs.

Your insights and experiences on this subject are welcome!

#PLM #ConfigurationItems #BOMs #EffectivityManagement #ProjectManagement #Engineering

Why PLM Is a Misnomer. Why Not PLD?

Product Lifecycle Management (PLM) is a term that has become ubiquitous in the world of manufacturing, engineering, and product development. But is this term truly reflective of what it encapsulates? Let’s dig into it.

What is the So-Called PLM?

By its traditional definition, PLM encompasses the holistic approach of managing a product’s life, from conception and design to manufacturing and post-sale service. It involves the systemization of:

  • Product design and engineering.
  • Means and processes of manufacturing.
  • Means and processes for maintenance and repair.
  • Means and processes for recycling.

Who Actually Manages the Product Lifecycle?

If we dissect the entire lifecycle of a product, it’s clear that PLM alone doesn’t manage it in its entirety.

ERP:

Enterprise Resource Planning systems handle production planning, procurement, and launching production. From raw materials to finished products, the ERP system provides the overarching framework.

MES:

Manufacturing Execution Systems (MES) track and document the transformation of raw materials through to finished goods. MES ensures that the actual production is as efficient and defect-free as possible.

MRO:

Maintenance, Repair, and Operations (MRO) is focused on maintaining, repairing, and operating the production assets. It involves planning and tracking maintenance tasks, as well as repair operations to ensure the product’s efficient life after manufacturing.

IoT:

Internet of Things (IoT) technology offers real-time tracking and tracing of all information related to the manufacturing, usage, and operation of the product. This includes data acquisition from various sensors embedded in the product, which can be pivotal for lifecycle considerations such as preventive maintenance or upgrades.

Why PLM Should Really be Called PLD: Product Lifecycle Design

Considering these facts, the term “PLM” may be misleading. PLM primarily focuses on the design and development aspects of a product. It’s about enabling a collaborative environment where designs, requirements, and documentation can be centrally managed and accessed. It doesn’t really manage the “whole” lifecycle of a product, but rather sets the stage for other systems like ERP, MES, MRO, and IoT to play their parts. A more appropriate term could be PLD, or Product Lifecycle Design, as it highlights the design aspect that is the core focus of what we traditionally call PLM.

Conclusion

Language is powerful, and the terms we use should accurately reflect what they represent. Is it time for the industry to reconsider the term PLM? Perhaps, adopting a term like PLD might align more closely with what these systems actually do. What do you think?

About PLM System Selection: Insights from Over Three Decades

Throughout my 35-year journey in the world of Product Lifecycle Management (PLM), I’ve been deeply involved in the selection phase of PLM systems, initially from the client’s perspective and soon after, from the vantage point of software publishers and integrators.

Time and again, I’ve been struck by the superficiality that tends to overshadow this pivotal decision-making process, even for projects of several tens of millions of euros/dollars.

Here are some key observations and insights:

  1. Internal Expertise: Often, a company’s internal teams lack the in-depth PLM expertise necessary for informed decision-making. Such a void leads to choices based on surface-level knowledge, missing the deeper intricacies of what makes a PLM system genuinely effective.
  2. The Consultant Dilemma: Many independent consultants, instead of providing unbiased advice, tread cautiously to avoid offending software publishers, fearing blacklisting. This conflict of interest invariably compromises the integrity of their recommendations.
  3. Use Cases: More Surface than Substance: The use cases presented usually lack depth and comprehensive understanding, leading to a disconnect between system capabilities and actual organizational needs.
  4. Overlooking PLM Theory: In the quest for immediate solutions, the foundational theory of PLM, the core models, and basic algorithms are often undervalued. A deeper appreciation of these fundamentals can significantly inform better system selection.
  5. The Misconception of Equivalence: A prevalent false belief is that all PLM solutions are created equal. However, in reality, there are significant disparities between them. Overlooking these distinctions can lead to settling for a system that doesn’t truly meet organizational needs.
  6. Technical Aspects: Often Undervalued: A PLM system’s technical foundation plays a monumental role in its efficacy. Yet, this facet often takes a back seat during the selection process, leading to potential pitfalls down the road.
  7. Political Choices: Sometimes, PLM system selection is swayed by internal politics rather than the system’s capabilities or the organization’s genuine needs. Such decisions can often lead to complications in the future, as a politically driven choice may not align with operational requirements.
  8. Overvaluing End-user and Middle Management Opinions: While it’s crucial to consider the perspectives of end-users and middle management, over-relying on their opinions can skew the selection process. This overemphasis can bolster superficial criteria over deep technical or functional considerations.
  9. Publishers’ Misrepresentations: Sadly, I’ve witnessed instances where software publishers, in their eagerness to secure a deal, provide responses that range from overly optimistic portrayals to outright misrepresentations.

Selecting a PLM system is not a decision to be made lightly. It requires a holistic approach that considers technical capabilities, aligns with organizational needs, and is backed by genuine expertise. As stakeholders in this process, it’s imperative that we challenge superficiality, advocate for thoroughness, and champion the value of deep knowledge.

What Makes a Great PLM? And who can help you know if it’s a great PLM?

What must be evaluated in a PLM?

Product Lifecycle Management (PLM) systems are pivotal for organizations to streamline their product development processes and manage product-related information throughout its lifecycle. A top-notch PLM system is the backbone of successful product management. Here’s what sets the best apart:

  1. User Experience: A user-friendly interface is paramount. It not only ensures ease of use but also accelerates adoption across teams. The smoother the user experience, the more seamless the product lifecycle management.
  2. Functional Scope: The breadth and depth of functionalities determine how effectively a PLM system can cater to diverse needs. Comprehensive features are essential for end-to-end product management.
  3. Core Data Model: It’s crucial to have an accurate, rich, and extensible data model. This foundation determines how well the PLM can handle complex product information and changes over time.
  4. Business Logic and Core Algorithms: Features such as a Single Product Structure Engine and Occurrence Management algorithms streamline the management of products, ensuring optimal efficiency and accuracy.
  5. Architecture:
    • Unified Architecture: Cohesion and global unity ensure data consistency and alignment across teams and regions.
    • Type & Flexibility: Whether 3-Tier or 4-Tier, the architecture’s design impacts adaptability. Key elements like independence of the user interface, the number of databases, and potential for dockerization play pivotal roles.
    • Connectivity: An optimal PLM offers a multitude of web services (see Customization below).
    • Security: An optimal PLM keeps network port requirements minimal for efficient integration and places a premium on cybersecurity.
  6. Openness: The ability to seamlessly interact or interface with third-party software, authoring tools, partners, customers, etc. is vital. The capability to exchange complex data in both directions, with strong filtering and traceability, is a major feature of an efficient PLM.
    It enables flexibility and ensures that the PLM can evolve with emerging technologies and needs.
  7. Performance: Both server performance and communication performance are crucial. Optimized LAN/WAN communication (considering chattiness and packet optimization) ensures real-time, smooth data access and collaboration.
  8. Customization & Sustainability: The future is in adaptability. A superior PLM offers APIs, web services, and seamless integration with low/no-code platforms. But customization doesn’t compromise sustainability; stable APIs and consistent web services ensure longevity. And what about software upgrades in case of customized solutions ?
  9. Scalability: As businesses evolve, a top-tier PLM adapts, ensuring it grows in tandem with organizational needs.

In conclusion, while many factors contribute to a successful PLM system, these key elements form the cornerstone. Investing in a PLM that embodies these characteristics can significantly bolster an organization’s product management efficacy.

Who must be involved for the solution evaluation?

For each of the nine topics concerning a great PLM system, different stakeholders within the organization will have varying perspectives based on their roles and expertise. Here’s a breakdown of who is most able to evaluate each topic:

User Experience:

  • End User: Primary evaluator, as they interact directly with the interface and can best judge usability.
  • Business Expert: Can also provide feedback on how well the user experience aligns with business processes.

Functional Scope:

  • Business Expert: Best positioned to understand if all necessary business functionalities are present.
  • PLM Expert: Can assess how well the functionalities match industry standards and best practices.

Core Data Model:

  • PLM Expert: Understands the intricacies of data modeling in PLM systems and can evaluate its adequacy.
  • IT Expert: Can assess the technical aspects of the data model.

Business Logic and Algorithms:

  • Business Expert: Can evaluate if the algorithms align with business needs and processes.
  • PLM Expert: Understands industry standards for business logic in PLM systems.

Architecture:

  • IT Expert: Most capable of assessing the technical architecture and its coherence.
  • PLM Expert: Can evaluate how the architecture aligns with PLM best practices.

Openness:

  • IT Expert: Primary evaluator for technical integrations with third-party tools.
  • PLM Expert: Can assess the PLM’s compatibility with industry standards.

Performance:

  • IT Expert: Best positioned to evaluate server performance and communication protocols.
  • End User: Can provide feedback on real-time performance issues they encounter.

Customization & Sustainability:

  • IT Expert: Evaluates the quality and stability of customization tools and their potential long-term integration and evolutions.
  • PLM Expert: Understands the long-term requirements for PLM systems in the industry.

Scalability:

  • IT Expert: Can assess the technical scalability of the system.
  • Management: Can provide insights on future growth plans and if the PLM can accommodate such growth.

In summary, while different profiles can evaluate multiple topics, certain profiles are better suited to assess specific areas due to their expertise and daily interaction with the PLM system. Collaborative evaluation involving various profiles will provide a comprehensive assessment of the PLM’s capabilities.

#PLM #ProductManagement #Technology
