Author: Frederic ZELLER

PLM, Graph Theory, AI, Change Management, and Digital Thread

  1. Managing Information in PLM: The size and complexity of products, coupled with the scope of PLM (engineering, manufacturing, requirements, MBSE, quality management, etc.), mean that a PLM system must handle enormous quantities of information. This includes both business objects and the relationships between them, varying based on the granularity of the information.
  2. Graph Theory Application: Graph theory should be considered for this management, but it must be carefully studied and adapted. PLM business objects are complex, polymorphic, and subject to revisions, while the relationships are strongly typed, carry applicability information (such as dates or batch numbers), and require interpretation for optimal navigation among objects.
  3. Digital Thread Management and Impact Analysis: Managing digital continuity and analyzing the impact of modifications necessitates a reasoned navigation through the graph of relationships. For graphs with tens of thousands of objects and hundreds of thousands of relationships, this task is nearly impossible for humans without powerful filters.
    • If filters are static, we revert to « classical » views (requirements, MBSE, engineering, manufacturing, quality), which diminishes the benefits of graph theory.
    • Dynamic filters, defined according to the context or type of change, make search and navigation far more efficient. This is where AI deserves strong consideration, since the filter itself can be proposed from the change context (a minimal sketch follows this list).
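To make this concrete, below is a minimal Python sketch of a typed PLM graph and a context-driven impact traversal. Everything here is illustrative: the class names, relationship types, and identifiers are invented, and a real PLM would persist such a graph in a database rather than in memory.

```python
from dataclasses import dataclass
from datetime import date
from collections import deque

@dataclass(frozen=True)
class BusinessObject:
    """A PLM business object (part, requirement, process...); the complexity,
    polymorphism, and revisions live on this side of the graph."""
    oid: str
    kind: str
    revision: str

@dataclass(frozen=True)
class Relationship:
    """A strongly typed link carrying applicability information (dates here,
    but batch numbers would work the same way)."""
    source: str
    target: str
    rel_type: str                  # e.g. "satisfied_by", "manufactured_by"
    effective_from: date
    effective_to: date = date.max

def impact_analysis(relationships, start_oid, rel_filter, as_of):
    """Breadth-first walk pruned by a dynamic filter: only relationship
    types relevant to the change at hand are followed, and only links
    applicable at the given date. `rel_filter` is exactly the kind of
    thing an AI model could propose from the change context."""
    by_source = {}
    for r in relationships:
        by_source.setdefault(r.source, []).append(r)

    impacted, queue = set(), deque([start_oid])
    while queue:
        for r in by_source.get(queue.popleft(), []):
            applicable = r.effective_from <= as_of <= r.effective_to
            if r.rel_type in rel_filter and applicable and r.target not in impacted:
                impacted.add(r.target)
                queue.append(r.target)
    return impacted

rels = [
    Relationship("REQ-042", "PRT-100", "satisfied_by", date(2023, 1, 1)),
    Relationship("PRT-100", "OP-007", "manufactured_by", date(2023, 6, 1)),
]
print(impact_analysis(rels, "REQ-042", {"satisfied_by", "manufactured_by"},
                      date(2024, 1, 1)))   # {'PRT-100', 'OP-007'}
```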

Conclusion: This is where artificial intelligence (AI) plays a crucial role. AI can provide a holistic and exhaustive perspective on digital continuity and the implications of modifications, something indispensable in complex PLM environments. The integration of AI in PLM, particularly in conjunction with graph theory, transforms what would otherwise be an overwhelming influx of data into a structured, navigable, and insightful resource. Graph theory alone, albeit useful, achieves its full potential only when complemented with advanced AI tools, making it an indispensable asset in modern PLM strategies and impact analyses.

The States (and Stages) of a Part in PLM: Navigating the Complex Landscape

In the realm of product development, the journey of a single part is a multifaceted expedition, marked by distinct stages and considerations. Let’s embark on a voyage through the various states of a part:

  1. Specifications:
    • It all begins with an idea or a need. The part is identified and defined by a set of specifications, requirements, and sometimes intricate schematics. This initial phase sets the groundwork for what the part needs to achieve and how it should fit into the overall product.
  2. Geometric Part:
    • As the concept takes shape, the part enters the realm of Computer-Aided Design (CAD). Here, its geometric form is meticulously crafted, defining its shape, size, and functionality. However, aspects like material and treatments remain open for further refinement.
      It is the stage of the CAD BOM of the product.
  3. Physical Part FFF (Form, Fit, Function):
    • In this phase, the part transforms into a comprehensive entity where all its essential characteristics, excluding aesthetics, are precisely defined. It’s about ensuring that the part works seamlessly, fits perfectly, and serves its intended function.
      Generally, the initial EBOM is built when the part reaches this state.
  4. Physical Part FFFA (Form, Fit, Function, Aesthetic):
    • The final transformation occurs when aesthetics come into play. The part evolves into its complete form, with all characteristics, including color, texture, and appearance, meticulously defined. It becomes not just functional but also visually aligned with design requirements.
      This part is managed either in the enriched EBOM (often with Options/Variants) or in the product configurator.
  5. Manufacturer Reference:
    • For commercial parts or those sourced from various suppliers, a critical step is assigning a manufacturer reference. This reference is the exact code corresponding to the fully defined FFFA part, complete with specifications.
  6. Manufacturer Reference in Storage (SAP’s « MATERIAL »):
    • In the domain of inventory and supply chain management, the manufacturer reference becomes crucial once more. It corresponds to the part’s identity in storage, a bridge between digital and physical realms, often managed within an ERP system like SAP. It is typically the part that appears in the MRP BOM (the sketch below models these states as a simple lifecycle).
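As a toy illustration, these six states can be modeled as an ordered lifecycle, with each transition adding information to the part. This is a deliberate simplification (real workflows branch and iterate), and all names are invented:

```python
from enum import Enum, auto

class PartState(Enum):
    """The six states described above, in order."""
    SPECIFICATION = auto()      # requirements, specifications, schematics
    GEOMETRIC = auto()          # CAD geometry; material and treatments still open
    PHYSICAL_FFF = auto()       # form, fit, and function fully defined
    PHYSICAL_FFFA = auto()      # aesthetics (color, texture, finish) added
    MANUFACTURER_REF = auto()   # exact supplier code for the FFFA part
    STORED_MATERIAL = auto()    # the ERP "MATERIAL", as seen in the MRP BOM

# Which BOM each state typically feeds, per the description above.
BOM_OF_STATE = {
    PartState.GEOMETRIC: "CAD BOM",
    PartState.PHYSICAL_FFF: "initial EBOM",
    PartState.PHYSICAL_FFFA: "enriched EBOM / product configurator",
    PartState.STORED_MATERIAL: "MRP BOM",
}

def advance(state: PartState) -> PartState:
    """Move a part to its next state; raises once the part is fully defined."""
    members = list(PartState)
    nxt = members.index(state) + 1
    if nxt >= len(members):
        raise ValueError("part is already in its final state")
    return members[nxt]

state = PartState.SPECIFICATION
while state is not PartState.STORED_MATERIAL:
    state = advance(state)
    print(state.name, "->", BOM_OF_STATE.get(state, "-"))
```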

These states are integral to efficient product management and supply chain coordination. However, it’s worth noting two important observations:

a) Vocabulary Gaps: Precise terminology to describe these states can sometimes be elusive, highlighting the complexity of managing parts.

b) Additional States: There are more states to explore, such as parametric parts and part families, often well-managed in CAD systems but posing intricate challenges in PLM.

As we navigate this complex landscape of part states, we appreciate that it’s not just about managing components but orchestrating a symphony of data, processes, and collaboration across PLM, ERP, and CAD systems. Join the conversation and share your insights into the evolving world of part management.

#ProductDevelopment #SupplyChain #PLM #ERP #Innovation

The Dual Perspectives of PLM: FFF vs. FFFA

In the realm of consumer products, which constitute a significant portion of industrial production, aesthetics plays a crucial role in product design. The three components of aesthetics are:

  • The shape of the object.
  • Its materials and surface finishes (for non-functional visible surfaces).
  • Its color(s), along with their characteristics (exact hue, glossiness, etc.).

The shapes of an object, typically defined by surfaces, are part of the basic product definition (FFF: Form, Fit, Function). Altering these surfaces directly impacts the physical characteristics of the object. However, this is not the case with surface finishes (for non-functional surfaces) and colors. Modifying them does not compromise the functional qualities of the object. For instance, mounting a green door on a red car might make it less aesthetically pleasing, but the car will still run just as well and remain just as reliable!

Therefore, we can assert that the « product » and its components can be viewed from two different perspectives in the vast majority of cases:

  • The FFF perspective (Form Fit Function), independent of aesthetic characteristics, aims to create a perfectly functional product. This approach must be meticulous; for example, if a part is painted, the paint must be considered generically.
  • The FFFA (Form Fit Function Aesthetic) perspective, in which the product and its components are represented in their various colors and finishes.

This distinction is significant, as it remains relatively poorly managed by PLM systems to date. Generally, the entire product study is conducted at the « FFF » stage, and deriving the « FFFA » stage typically requires significant work to create « real » references and product configurations.
There is even a lack of vocabulary to differentiate the parts at these two stages.
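As a rough sketch of what that derivation work looks like, here is how a single FFF reference can expand into several FFFA references. The suffix-based numbering scheme is purely illustrative; every company has its own reference-coding rules:

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class FFFPart:
    """A part defined by form, fit, and function only; paint is generic."""
    reference: str

@dataclass(frozen=True)
class FFFAPart:
    """The same part with its aesthetics pinned down."""
    reference: str
    color: str
    finish: str

def derive_fffa(part: FFFPart, colors, finishes):
    """Expand one FFF part into its FFFA variants, one real reference each."""
    return [
        FFFAPart(f"{part.reference}-{color.upper()}-{finish.upper()}", color, finish)
        for color, finish in product(colors, finishes)
    ]

door = FFFPart("DOOR-4711")
for variant in derive_fffa(door, ["red", "green"], ["matte", "gloss"]):
    print(variant.reference)
# DOOR-4711-RED-MATTE, DOOR-4711-RED-GLOSS, DOOR-4711-GREEN-MATTE, ...
```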

Additionally, attention must be given to (very rare) cases where color can physically impact a component or product:

  • Changes in the characteristics of a plastic material due to the addition of a specific dye, as in the case of mass-dyed parts.
  • Physical changes due to color, such as increased absorption of solar rays causing greater heating when a part is black, for example.

In conclusion, recognizing and effectively managing these two perspectives, FFF and FFFA, is crucial for the holistic development and aesthetic customization of consumer products within the PLM framework.

A View of PLM Systems Architecture: Balancing Contradictions and Complexity

The evolving world of Product Lifecycle Management (PLM) is witnessing constant innovations. With products becoming increasingly complex, PLM systems are striving to streamline the product development process. As we delve into the architecture of PLM systems, we encounter both contradictions and multilayered complexities.

1) Contradictory Constraints:

  • A Holistic Approach: Modern PLM systems are shifting towards a holistic model, aiming to capture the entire essence of a product’s lifecycle from ideation to disposal.
  • Flexibility & Modularity: In parallel, there’s an emphasis on creating PLM systems that are both agile and modular. This modularity ensures they can swiftly adapt to changing requirements and incorporate new tools.
  • Deep Environmental Integration: PLM systems are increasingly embedded in their surrounding environment. This entails harmonious integration with authoring tools, diverse data sources, and interfaces with systems like ERP (Enterprise Resource Planning), MES (Manufacturing Execution Systems), and MRO (Maintenance, Repair, and Operations).

2) A 3-4 Tiered Logical Architecture:

  • Individual Work: This foundational layer is dedicated to integrating authoring tools. The prime objective here is to ensure that each user can access a tailored work environment suitable for their tasks. Furthermore, users should be able to elevate the outcomes of their individual tasks to the team level, ensuring data compatibility and coherence with the overall management system.
  • TDM (Team Data Management): This layer focuses on the operational processes at the team or department level. It supervises the management of working environments, consolidates individual contributions, and structures data in a format fit for the enterprise repository.
  • EDM (Enterprise Data Management): Serving as the enterprise’s central data repository, the EDM layer encompasses a broad spectrum of views, from initial requirements and systems engineering to physical design, manufacturing, and even recycling. It is the heart of the PLM, centralizing data and ensuring its integrity.
    While feeding the TDM layer with all essential data and creating ad hoc working environments, it also consolidates feedback information, defines its applicability, and ensures a digital thread of continuity between different views. The goal here is to remain as “neutral” as possible, making it adaptable to different contexts.
  • Interface & Formatting Layer: This layer’s primary role is to fine-tune data and channel it efficiently to downstream applications like MES, ERP, and MRO.

The PLM system architecture must describe an environment of profound complexity. Each layer, while fulfilling a distinct function, can be subdivided into several interactive modules. These modules, in unison, manage various data streams, emphasizing the complex yet harmonious nature of PLM systems. Essentially, the architecture must strike a balance, adapting finely to these contradictory constraints and complex design elements.
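A minimal sketch of this layered flow, with invented class names and plain dictionaries standing in for real PLM records, might look like the following. It illustrates the direction of data movement between tiers, not any particular vendor’s architecture:

```python
from abc import ABC, abstractmethod

class AuthoringTool(ABC):
    """Tier 1, individual work: CAD, simulation, requirements editors..."""
    @abstractmethod
    def export_result(self) -> dict: ...

class CadTool(AuthoringTool):
    def export_result(self) -> dict:
        return {"kind": "geometry", "part": "PRT-100"}

class TeamDataManager:
    """Tier 2 (TDM): consolidates individual contributions for the team."""
    def __init__(self):
        self.workspace = []
    def check_in(self, contribution: dict) -> None:
        self.workspace.append(contribution)
    def consolidate(self) -> list:
        # Structure the data in a format fit for the enterprise repository.
        return [dict(c, status="team-approved") for c in self.workspace]

class EnterpriseDataManager:
    """Tier 3 (EDM): neutral central repository; owns the digital thread."""
    def __init__(self):
        self.repository = []
    def publish(self, records: list) -> None:
        self.repository.extend(records)

class DownstreamInterface:
    """Tier 4: fine-tunes and channels data to MES, ERP, and MRO."""
    def format_for(self, system: str, records: list) -> list:
        return [dict(r, target=system) for r in records]

tdm, edm, out = TeamDataManager(), EnterpriseDataManager(), DownstreamInterface()
tdm.check_in(CadTool().export_result())          # individual -> team
edm.publish(tdm.consolidate())                   # team -> enterprise
print(out.format_for("ERP", edm.repository))     # enterprise -> downstream
```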

The Dilution of a Critical Role: The Overuse of the Term ‘Solution Architect’ in PLM

In the field of Product Lifecycle Management (PLM), the term « Solution Architect » has been increasingly and recklessly tossed around, leading to an inevitable dilution of the role’s specialized nature. What once signified a seasoned professional skilled in PLM’s intricate terrains has now often been misattributed to application engineers, senior configurators, and functional consultants. The intent of this article is to clarify what truly makes a Solution Architect in PLM and why it’s more than just another title.

Misuse of the Title: A Loss in Specialization

It’s become alarmingly frequent for roles like application engineers and senior configurators to receive the title of ‘Solution Architect in PLM.’ While these roles are significant in their areas, they lack the comprehensive skills needed to warrant the title. This lack of specificity is not only confusing but also detrimental to the quality of PLM projects.

Rich Functional Depth and Module Interaction

A genuine Solution Architect is expected to have a profound understanding of the rich functional layers and how different modules interact in the leading PLM solutions. The professional should be well-versed in everything from Computer-Aided Design (CAD) to Material Requirements Planning (MRP), and understand how these modules communicate to provide a cohesive system.

The 10-Year Experience Benchmark

A decade of experience is often touted as the benchmark for Solution Architects in PLM, with a minimum of 4 years focused on the specific solution to be implemented. This is not mere credential inflation. Here’s why:

  1. Deep Expertise: Four or more years on a specific PLM solution allow for a profound understanding of its capabilities and limitations.
  2. Risk Mitigation: A seasoned architect is less likely to underestimate challenges and can better predict potential roadblocks.
  3. Strategic Insight: Ten years in the field enables a long-term strategic viewpoint, indispensable for the success of PLM projects.
  4. Change Management: Experienced architects are better at helping organizations navigate the change a new PLM system will bring.
  5. Tech Stack Knowledge: A decade in PLM ensures a comprehensive understanding of the evolving tech stack that surrounds these systems.

The Importance of Multiple Experiences

Having diverse experiences across different PLM projects, industries, or functions can enrich a Solution Architect’s ability to see the bigger picture and to adapt strategies to unique circumstances.

Mastery of PLM Integration in the IT Ecosystem

True Solution Architects are experts in:

  • Authoring Tool Integration: This includes CAD for mechanical, electrical, and electronic components, as well as analysis and simulation tools.
  • Enterprise System Interface: They should be skilled in interfacing with Manufacturing Execution Systems (MES), Enterprise Resource Planning (ERP), Maintenance, Repair, and Overhaul (MRO), and Industrial Internet of Things (IIoT).

Holistic Vision and Client Constraints

An effective Solution Architect in PLM should possess a holistic view of the entire product lifecycle and be well-equipped to consider client-specific constraints, such as migration challenges and change management necessities.

Conclusion

The role of a Solution Architect in PLM is highly nuanced and demands a unique blend of expertise, experience, and vision. As the complexity of PLM solutions continues to escalate, it’s crucial that the integrity of this essential role be preserved. Organizations should be cautious in title attribution and ensure that those who bear it fully meet the specialized criteria that truly define a Solution Architect in the realm of PLM.

PLM and BOM Management: A Deeper Look into Configuration Items and Effectivity Management

Today, I propose to discuss one significant but often misunderstood term in the realms of PLM, engineering, software development, and project management: Configuration Items (CIs) within Bills of Materials (BOMs). With a focus on their importance for effectivity management, let’s re-examine these critical elements that lay the foundation for optimized performance and control over organizational assets.

What is a Configuration Item (CI)? A Closer Look

A Configuration Item (CI) is an individual component within a larger system that can be managed independently. However, it’s essential to note that, in most cases, a CI is a supplied subassembly whose configuration is managed by a provider.

Example: Imagine an automobile manufacturing process. The navigation system in the car could be considered a CI. It is supplied by a third-party vendor, and its internal workings—say, the software, the GPS module, etc.—are configured and controlled by that vendor.

CI: A ‘Grey Box’ in Effectivity Management

Effectivity management focuses on ensuring that the right configurations are used at the right time to meet certain conditions or requirements. A CI serves as a « grey box » in this context, representing a break in effectivity management. The product sees the CI as a single unit and doesn’t care about its composition.

Why is this significant?

  1. Streamlined Complexity: When a CI is treated as a grey box, it allows the primary manufacturer to focus on the integration of the CI without getting entangled in its intricate details.
  2. Provider Expertise: The provider who manages the CI is often an expert in that specific subassembly, ensuring that it meets all performance and quality criteria.
  3. Simplified Compliance and Documentation: Since the configuration of the CI components is managed at the CI level and not the product level, it makes compliance with standards and regulations more straightforward.

Why a ‘Grey Box’ for Configuration Items?

The term « grey box » is often used in the context of Configuration Items (CIs) to describe a component whose internal workings are not entirely transparent to the end-user or primary manufacturer but are still somewhat understood or defined. Unlike a « black box, » where the internal components and activities are completely unknown, a « grey box » suggests that while the primary manufacturer may not manage or control the CI’s internals, some level of information is available.

  1. Provider’s Expertise: The vendor or provider who supplies the CI manages its internals. This management typically involves specialized knowledge that the primary manufacturer may not have, making it more effective for the CI to be a « grey box » from the manufacturer’s perspective.
  2. Simplified Integration: Treating the CI as a grey box allows the primary manufacturer to focus on how it fits and functions within the larger system, without getting bogged down by its internal complexity.
  3. Effectivity Management: Each CI has its own effectivity and Option/Variant rules managed by the provider. Therefore, the product sees the CI as a self-contained unit, making it easier to manage effectivity at the product level.
  4. Flexibility for Special Cases: The « grey box » approach allows for exceptions, such as spare parts management. Even though a CI is generally treated as a self-contained unit, there can be scenarios where one might need to manage its components separately. For example, if a sub-component of a CI needs to be replaced as a spare part, it can still be managed at that level without altering the rest of the CI. This would not be possible if the CI were a completely closed « black box. »

By treating CIs as « grey boxes, » manufacturers maintain a level of flexibility and control without delving too deep into complexities that can be more efficiently managed by the CI’s provider. It offers a balanced approach that can be advantageous in various manufacturing and project management scenarios.
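A minimal sketch of this grey-box behavior, with an invented BOM structure: the product-level effectivity walk stops at the CI boundary, while a spare-part lookup is still allowed to reach inside.

```python
from dataclasses import dataclass, field

@dataclass
class BomNode:
    part_number: str
    children: list = field(default_factory=list)
    is_ci: bool = False   # a subassembly configured and managed by its provider

def product_level_view(node: BomNode, out=None) -> list:
    """Effectivity walk at product level: a CI is recorded as a single
    unit, and the walk does not descend into it (the grey box)."""
    out = [] if out is None else out
    out.append(node.part_number)
    if not node.is_ci:
        for child in node.children:
            product_level_view(child, out)
    return out

def find_spare(node: BomNode, part_number: str):
    """Spare-part exception: this lookup *is* allowed inside the grey box."""
    if node.part_number == part_number:
        return node
    for child in node.children:
        hit = find_spare(child, part_number)
        if hit is not None:
            return hit
    return None

car = BomNode("CAR", [
    BomNode("NAV-SYSTEM", [BomNode("GPS-MODULE"), BomNode("NAV-SOFTWARE")],
            is_ci=True),
    BomNode("DOOR"),
])
print(product_level_view(car))        # ['CAR', 'NAV-SYSTEM', 'DOOR']
print(find_spare(car, "GPS-MODULE"))  # reaches inside the CI
```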

The Role of CIs in Large Assembly BOMs

Bills of Materials (BOMs) list all the components, materials, and sub-assemblies necessary to manufacture a product. For large assembly BOMs, CIs become particularly crucial for a few key reasons:

  1. Modularity: Treating CIs as grey boxes allows for more manageable modules within the BOM.
  2. Cost and Time Efficiency: Understanding CIs in this manner helps in better budget control and allows for parallel development activities.
  3. Risk Containment: Should a problem arise in a CI, it can be addressed without derailing the larger project, as the CI is a self-contained unit managed by the provider.

Understanding the nuances of Configuration Items and their role in effectivity management and large assembly BOMs can be a game-changer. It can significantly impact how efficiently a project progresses and how effectively a product performs.

Your insights and experiences on this subject are welcome!

#PLM #ConfigurationItems #BOMs #EffectivityManagement #ProjectManagement #Engineering

Why PLM Is a Misnomer. Why Not PLD?

Product Lifecycle Management (PLM) is a term that has become ubiquitous in the world of manufacturing, engineering, and product development. But is this term truly reflective of what it encapsulates? Let’s dig into it.

What is the So-Called PLM?

By its traditional definition, PLM encompasses the holistic approach of managing a product’s life, from conception and design to manufacturing and post-sale service. It involves the systemization of:

  • Product design and engineering.
  • Means and processes of manufacturing.
  • Means and processes for maintenance and repair.
  • Means and processes for recycling.

Who Actually Manages the Product Lifecycle?

If we dissect the entire lifecycle of a product, it’s clear that PLM alone doesn’t manage it in its entirety.

ERP:

Enterprise Resource Planning systems handle production planning, procurement, and production launch. From raw materials to finished products, the ERP system provides the overarching framework.

MES:

Manufacturing Execution Systems (MES) track and document the transformation of raw materials through to finished goods. MES ensures that the actual production is as efficient and defect-free as possible.

MRO:

Maintenance, Repair, and Operations (MRO) is focused on maintaining, repairing, and operating the production assets. It involves planning and tracking maintenance tasks, as well as repair operations to ensure the product’s efficient life after manufacturing.

IoT:

Internet of Things (IoT) technology offers real-time tracking and tracing of all information related to the manufacturing, usage, and operation of the product. This includes data acquisition from various sensors embedded in the product, which can be pivotal for lifecycle considerations such as preventive maintenance or upgrades.

Why PLM Should Really be Called PLD: Product Lifecycle Design

Considering these facts, the term « PLM » may be misleading. PLM primarily focuses on the design and development aspects of a product. It’s about enabling a collaborative environment where designs, requirements, and documentation can be centrally managed and accessed. It doesn’t really manage the « whole » lifecycle of a product, but rather sets the stage for other systems like ERP, MES, MRO, and IoT to play their parts. A more appropriate term could be PLD, or Product Lifecycle Design, as it highlights the aspect of design which is the core focus of what we traditionally call PLM.

Conclusion

Language is powerful, and the terms we use should accurately reflect what they represent. Is it time for the industry to reconsider the term PLM? Perhaps, adopting a term like PLD might align more closely with what these systems actually do. What do you think?

About PLM System Selection: Insights from Over Three Decades

Throughout my 35-year journey in the world of Product Lifecycle Management (PLM), I’ve been deeply involved in the selection phase of PLM systems, initially from the client’s perspective and soon after, from the vantage point of software publishers and integrators.

Time and again, I’ve been struck by the superficiality that tends to overshadow this pivotal decision-making process, even for projects of several tens of millions of euros/dollars.

Here are some key observations and insights:

  1. Internal Expertise: Often, a company’s internal teams lack the in-depth PLM expertise necessary for informed decision-making. Such a void leads to choices based on surface-level knowledge, missing the deeper intricacies of what makes a PLM system genuinely effective.
  2. The Consultant Dilemma: Many independent consultants, instead of providing unbiased advice, tread cautiously to avoid offending software publishers, fearing blacklisting. This conflict of interest invariably compromises the integrity of their recommendations.
  3. Use Cases: More Surface than Substance: The use cases presented usually lack depth and comprehensive understanding, leading to a disconnect between system capabilities and actual organizational needs.
  4. Overlooking PLM Theory: In the quest for immediate solutions, the foundational theory of PLM, the core models, and basic algorithms are often undervalued. A deeper appreciation of these fundamentals can significantly inform better system selection.
  5. The Misconception of Equivalence: A prevalent false belief is that all PLM solutions are created equal. However, in reality, there are significant disparities between them. Overlooking these distinctions can lead to settling for a system that doesn’t truly meet organizational needs.
  6. Technical Aspects: Often Undervalued: A PLM system’s technical foundation plays a monumental role in its efficacy. Yet, this facet often takes a back seat during the selection process, leading to potential pitfalls down the road.
  7. Political Choices: Sometimes, PLM system selection is swayed by internal politics rather than the system’s capabilities or the organization’s genuine needs. Such decisions can often lead to complications in the future, as a politically driven choice may not align with operational requirements.
  8. Overvaluing End-user and Middle Management Opinions: While it’s crucial to consider the perspectives of end-users and middle management, over-relying on their opinions can skew the selection process. This overemphasis can bolster superficial criteria over deep technical or functional considerations.
  9. Publishers’ Misrepresentations: Sadly, I’ve witnessed instances where software publishers, in their eagerness to secure a deal, provide responses that range from overly optimistic portrayals to outright misrepresentations.

Selecting a PLM system is not a decision to be made lightly. It requires a holistic approach that considers technical capabilities, aligns with organizational needs, and is backed by genuine expertise. As stakeholders in this process, it’s imperative that we challenge superficiality, advocate for thoroughness, and champion the value of deep knowledge.

What Makes a Great PLM? And Who Can Help You Know If It’s a Great PLM?

What must be evaluated in a PLM?

Product Lifecycle Management (PLM) systems are pivotal for organizations to streamline their product development processes and manage product-related information throughout its lifecycle. A top-notch PLM system is the backbone of successful product management. Here’s what sets the best apart:

  1. User Experience: A user-friendly interface is paramount. It not only ensures ease of use but also accelerates adoption across teams. The smoother the user experience, the more seamless the product lifecycle management.
  2. Functional Scope: The breadth and depth of functionalities determine how effectively a PLM system can cater to diverse needs. Comprehensive features are essential for end-to-end product management.
  3. Core Data Model: It’s crucial to have an accurate, rich, and extensible data model. This foundation determines how well the PLM can handle complex product information and changes over time.
  4. Business Logic and Core Algorithms: Features such as a Single Product Structure Engine and Occurrence Management algorithms streamline the management of products, ensuring optimal efficiency and accuracy (see the occurrence sketch after this list).
  5. Architecture:
    • Unified Architecture: Cohesion and global unity ensure data consistency and alignment across teams and regions.
    • Type & Flexibility: Whether 3-Tier or 4-Tier, the architecture’s design impacts adaptability. Key elements like independence of the user interface, the number of databases, and potential for dockerization play pivotal roles.
    • Connectivity: An optimal PLM offers a multitude of web services (see Customization & Sustainability).
    • Security: An optimal PLM keeps Ethernet port requirements minimal for efficient integration and places a premium on cybersecurity.
  6. Openness: The ability to seamlessly interact or interface with third-party software, authoring tools, partners, customers, and more is vital. The capability to exchange complex data in both directions, with strong filtering and traceability, is a major feature of an efficient PLM.
    It enables flexibility and ensures that the PLM can evolve with emerging technologies and needs.
  7. Performance: Both server performance and communication performance are crucial. Optimized LAN/WAN communication (considering chattiness and packet optimization) ensures real-time, smooth data access and collaboration.
  8. Customization & Sustainability: The future is in adaptability. A superior PLM offers APIs, web services, and seamless integration with low/no-code platforms. But customization must not compromise sustainability: stable APIs and consistent web services ensure longevity. And how will software upgrades be handled for customized solutions?
  9. Scalability: As businesses evolve, a top-tier PLM adapts, ensuring it grows in tandem with organizational needs.
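On point 4, a toy example may help show what occurrence management is about: a single part definition can appear many times in a product, and each physical position (occurrence) must be addressable individually. The data model below is invented for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Part:
    """One definition, however many times the part is used."""
    number: str

@dataclass(frozen=True)
class Usage:
    """A parent-child link in the single product structure."""
    parent: Part
    child: Part
    quantity: int

def occurrences(usages, root, path=()):
    """Expand the structure into occurrence paths: each physical position
    of a part gets its own addressable path."""
    result = []
    for u in usages:
        if u.parent == root:
            for i in range(u.quantity):
                occ = path + (f"{u.child.number}#{i + 1}",)
                result.append(occ)
                result.extend(occurrences(usages, u.child, occ))
    return result

car, axle, wheel = Part("CAR"), Part("AXLE"), Part("WHEEL")
usages = [Usage(car, axle, 2), Usage(axle, wheel, 2)]
for occ in occurrences(usages, car):
    print("/".join(occ))
# Two part definitions below CAR, but six distinct occurrences:
# AXLE#1, AXLE#1/WHEEL#1, AXLE#1/WHEEL#2, AXLE#2, AXLE#2/WHEEL#1, ...
```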

In conclusion, while many factors contribute to a successful PLM system, these key elements form the cornerstone. Investing in a PLM that embodies these characteristics can significantly bolster an organization’s product management efficacy.

Who must be involved for the solution evaluation?

For each of the nine topics concerning a great PLM system, different stakeholders within the organization will have varying perspectives based on their roles and expertise. Here’s a breakdown of who is most able to evaluate each topic:

User Experience:

  • End User: Primary evaluator, as they interact directly with the interface and can best judge usability.
  • Business Expert: Can also provide feedback on how well the user experience aligns with business processes.

Functional Scope:

  • Business Expert: Best positioned to understand if all necessary business functionalities are present.
  • PLM Expert: Can assess how well the functionalities match industry standards and best practices.

Core Data Model:

  • PLM Expert: Understands the intricacies of data modeling in PLM systems and can evaluate its adequacy.
  • IT Expert: Can assess the technical aspects of the data model.

Business Logic and Algorithms:

  • Business Expert: Can evaluate if the algorithms align with business needs and processes.
  • PLM Expert: Understands industry standards for business logic in PLM systems.

Architecture:

  • IT Expert: Most capable of assessing the technical architecture and its coherence.
  • PLM Expert: Can evaluate how the architecture aligns with PLM best practices.

Openness:

  • IT Expert: Primary evaluator for technical integrations with third-party tools.
  • PLM Expert: Can assess the PLM’s compatibility with industry standards.

Performance:

  • IT Expert: Best positioned to evaluate server performance and communication protocols.
  • End User: Can provide feedback on real-time performance issues they encounter.

Sustainability and Customization:

  • IT Expert: Evaluates the quality and stability of customization tools and their potential for long-term integration and evolution.
  • PLM Expert: Understands the long-term requirements for PLM systems in the industry.

Scalability:

  • IT Expert: Can assess the technical scalability of the system.
  • Management: Can provide insights on future growth plans and if the PLM can accommodate such growth.

In summary, while different profiles can evaluate multiple topics, certain profiles are better suited to assess specific areas due to their expertise and daily interaction with the PLM system. Collaborative evaluation involving various profiles will provide a comprehensive assessment of the PLM’s capabilities.
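The mapping above boils down to a simple matrix. Here is one sketch of how an evaluation campaign might organize it; the pairings are exactly those listed above, while the code itself is merely illustrative:

```python
# Criterion -> (primary evaluator, secondary evaluator), as listed above.
EVALUATORS = {
    "User Experience":                ("End User", "Business Expert"),
    "Functional Scope":               ("Business Expert", "PLM Expert"),
    "Core Data Model":                ("PLM Expert", "IT Expert"),
    "Business Logic and Algorithms":  ("Business Expert", "PLM Expert"),
    "Architecture":                   ("IT Expert", "PLM Expert"),
    "Openness":                       ("IT Expert", "PLM Expert"),
    "Performance":                    ("IT Expert", "End User"),
    "Sustainability & Customization": ("IT Expert", "PLM Expert"),
    "Scalability":                    ("IT Expert", "Management"),
}

def agenda_for(role: str) -> list:
    """The topics a given profile should prepare to evaluate."""
    return [topic for topic, owners in EVALUATORS.items() if role in owners]

print(agenda_for("PLM Expert"))
print(agenda_for("Management"))   # ['Scalability']
```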

#PLM #ProductManagement #Technology

BOM or Product Structure? How the Concepts Have Converged.


The Bill of Materials (BOM) is a critical component in the realm of manufacturing, design, and product management. Over time, the term « BOM » has evolved in significance and is now often seen as synonymous with « Product Structure. » Let’s delve deeper into this idea.

Historical Context:

Traditionally, a Bill of Materials (BOM) was a list or document that specified the raw materials, parts, and components, along with their quantities, needed to manufacture a finished product. The BOM was essentially an ingredients list for manufacturing.

On the other hand, Product Structure described how a product was broken down into its constituent components, sub-assemblies, and assemblies. It was more of a hierarchical depiction, detailing how different parts fit into the overall product.
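A small sketch makes the contrast concrete: the historical BOM is what you get by flattening a hierarchical product structure into total quantities per part. The data and class names are invented:

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class Node:
    part: str
    quantity: int = 1
    children: list = field(default_factory=list)

def flatten(node: Node, multiplier: int = 1) -> Counter:
    """Collapse a hierarchical product structure into the historical,
    flat BOM: total quantity per part, structure discarded."""
    totals = Counter()
    for child in node.children:
        totals[child.part] += child.quantity * multiplier
        totals += flatten(child, child.quantity * multiplier)
    return totals

bike = Node("BIKE", children=[
    Node("WHEEL", 2, children=[Node("SPOKE", 32), Node("RIM", 1)]),
    Node("FRAME", 1),
])
print(flatten(bike))
# Counter({'SPOKE': 64, 'WHEEL': 2, 'RIM': 2, 'FRAME': 1})
```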

Convergence of Concepts:

  1. Complexity of Modern Products: As products have become more complex, so too has their documentation. It’s no longer enough to just list parts; manufacturers need to understand the relationships between parts, how they fit together, and the various dependencies. This necessitates a deep structural view of the product, blurring the lines between a mere list (BOM) and a detailed breakdown (Product Structure).
  2. Digital Evolution: With the rise of digital tools and Product Lifecycle Management (PLM) software, the BOM has evolved from a static list into a dynamic, multi-dimensional entity. Modern BOMs can now represent the hierarchical structure, variants, and configurations of a product, encompassing the essence of what was traditionally termed ‘Product Structure’.
  3. Holistic Product View: In today’s competitive market, a holistic view of the product is paramount. From design and engineering to manufacturing and after-sales support, understanding the product’s structure is critical. Thus, the BOM has expanded in scope to provide a 360-degree view of the product.
  4. Unified Terminology for Cross-Functional Collaboration: With multiple departments and teams (design, engineering, procurement, manufacturing, etc.) collaborating on a single product, a unified language is essential. Referring to the product’s structure as the BOM simplifies communication and ensures that everyone is on the same page.
  5. Lifecycle Management: Modern BOMs are not static. They change and evolve as the product moves through its lifecycle, reflecting design changes, substitutions, or adaptations based on feedback or supply chain dynamics. This dynamic nature aligns more with the concept of ‘Product Structure’, which inherently acknowledges the product’s evolving nature.

Conclusion:

The evolution of the BOM from a static list to a dynamic representation of a product’s entirety is a reflection of the complexities of modern manufacturing and product design. The convergence of the terms « BOM » and « Product Structure » is not just semantic; it mirrors the industry’s need for a more holistic, integrated, and detailed view of products. As products continue to evolve, it’s likely that our understanding and representation of their structure will evolve alongside them.
