Channel: SCN : Document List - SAP NetWeaver Master Data Management

No more Java connections can be set up to MDS


Purpose:

This article describes how to handle a connectivity problem between a front-end Java EE application and the MDM server (MDS).

 

 

Problem Overview:

You use CRM E-commerce, SRM MDM Catalog, MDM EC, or other Java EE applications, for which MDM Connector is used to set up the physical connection to MDS.

When you access any Java link, you get an error message such as "HTTP 500", "connection timed out", or "page cannot be loaded", even though the backend repository is up and running and the same configuration previously worked fine.

In the default trace, you find an error like:

Cannot create JCA connection. Cause exception: Cannot get connection for 120 seconds. Possible reasons: 1) Connections are cached within SystemThread(can be any server service or any code invoked within  SystemThread in the SAP J2EE Engine) or they are allocated from a non transactional ConnectionFactory. In case of transactional ConnectionFactory used from Application Thread there is an automatic mechanism which detects unclosed connections, 2)The pool size of adapter "MDM Factory" is not enough according to the current load of the system. In case 1) the solution is to check for cached connections using the Connector Service list_conns command or in case 2), to increase the size of the pool.

 

 

Solution:

As a system administration task, you can log into the Visual Administrator (VA) or SAP NetWeaver Administrator (NWA), depending on the Java version, check the current connection-related parameters, and decide whether there is indeed a connection leak:

UsedManagedConnectionsCount - Returns the count of currently used managed connections.

FreeManagedConnectionsCount - The count of all available managed connections.

ManagedConnectionsUsageRate - Ratio (in percent) between the used and maximum managed connection numbers for the connection pool of a specific connection factory.

MaxManagedConnectionsNumber - Connection pool maximum size.

WaitingForManagedConnectionCount - The count of clients waiting for a managed connection.

OpenedManagedConnectionsCount - Shows how many managed connections are opened per application (Table).

TimedOutConnectionRequests - The aggregated number of failed getConnection() requests, due to a heavy connection pool usage.

SAP Note 1822443 describes how to obtain these values.

 

To troubleshoot, follow the steps below:

1. SAP Note 1821867 addresses a known problem in which connections accumulate in the Java connection pool: used connections are not released, the connection count reaches the maximum value (default 1000), and no new connections can be set up. First check whether this note applies.

2. If step 1 does not solve the problem, increase the values of the following parameters for the resource adapter "MDM Factory":

Maximum Waiting Time for Connections = 120 sec (default) << the waiting time for a connection to be opened

Maximum Connections = 1000 (default) << the pool size

1.png

3. If the issue persists after increasing these values, use the "switch_debug_mode" and "list_conns" telnet commands. Run "list_conns" on a regular basis so that you can find out what is leaking connections before the pool is exhausted. More details are documented in SAP Note 719778.
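To make the failure mode in the trace concrete, here is a self-contained Java sketch (hypothetical code, not the SAP connector implementation) of a tiny connection pool with a maximum size and a waiting time. It shows how leaked, never-closed connections exhaust the pool so that a later getConnection() call times out, which is exactly case 1) in the trace above:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class PoolSketch {
    static class Pool {
        private final Deque<Object> free = new ArrayDeque<>();
        private int used = 0;
        private final int max;         // plays the role of "Maximum Connections"
        private final long waitMillis; // plays the role of "Maximum Waiting Time"

        Pool(int max, long waitMillis) { this.max = max; this.waitMillis = waitMillis; }

        synchronized Object getConnection() throws InterruptedException {
            long deadline = System.currentTimeMillis() + waitMillis;
            // Wait while the pool is exhausted; give up when the deadline passes.
            while (free.isEmpty() && used >= max) {
                long left = deadline - System.currentTimeMillis();
                if (left <= 0)
                    throw new IllegalStateException(
                        "Cannot get connection for " + waitMillis + " ms");
                wait(left);
            }
            used++;
            return free.isEmpty() ? new Object() : free.pop();
        }

        synchronized void close(Object conn) {  // must be called in a finally block
            used--;
            free.push(conn);
            notifyAll();
        }

        synchronized int usedCount() { return used; } // cf. UsedManagedConnectionsCount
    }

    public static boolean demo() throws InterruptedException {
        Pool pool = new Pool(2, 100);
        pool.getConnection();      // leaked: never closed
        pool.getConnection();      // leaked: never closed
        try {
            pool.getConnection();  // pool exhausted -> times out
            return false;
        } catch (IllegalStateException expected) {
            return true;           // the "Cannot get connection" case from the trace
        }
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("timed out as expected: " + demo());
    }
}
```

In the real system the same pattern plays out with the "Maximum Connections" and "Maximum Waiting Time for Connections" parameters of the "MDM Factory" adapter; closing every connection in a finally block is what prevents the leak.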

 

 

Related Notes:

1822443 - Connection pool parameters on different NetWeaver versions

1821867 - Accumulation of MDM connections in connection pool

719778 - DataSource fails to return connection


External Enrichment - Master Data Enrichment Approach


 

Introduction

SAP MDM (Master Data Management) is a tool for maintaining and managing an organization's master data across geographies. It can also be viewed as an enrichment tool: it enriches business data to provide correct, reliable information for better business decisions, shorter fault-resolution times, faster data processing, and more.

 

Still, some questions that the business needs answered for effective decision making cannot be answered directly by SAP Master Data Management. The tool does, however, define a baseline through an integration model that helps you answer business-specific questions before deciding whether particular data should be maintained further in the central hub.

 

This article revolves around:

 

  • Concept of External Enrichment
  • How SAP MDM Integrates with Third-Party Enrichment Tools
  • Key Benefits of This Approach

 

Need for External Enrichment

MDM plays a vital role in reducing duplicate data and mastering a business's data for master objects such as products, vendors, customers, and materials. There are scenarios, however, where the business needs further enrichment: not of core master and transactional data, but of data about the organizations the company is dealing with. For example:

 

 

  • Will my customer pay me on time, or will he pay at all?
  • What is the financial condition of my business partner? Are his financial statements in the red?
  • Will my supplier deliver on time?
  • Is my business partner reliable enough to do business with?
  • Is my business partner involved in lawsuits, or does he have any background that would hamper my business?

 

 

Getting answers to all these questions is what is called "risk management". These questions cannot be answered directly by MDM, but SAP Master Data Management can be modeled so that, before storing any master data such as a vendor, material, or customer, it obtains complete information about that entity and helps the organization make efficient decisions. Such enrichment is not possible with MDM alone; yet to manage risk effectively, it is imperative to have all of this data consolidated for each master data record. With a diverse system landscape and duplicate customers and vendors existing in various parts of the organization, assessing risk becomes a major challenge. Inaccurate risk management leads not only to bad financial decisions but may also hamper your relations with vendors and customers.

 

 

External Enrichment

 

 

External enrichment services are provided either remotely by a service provider or by locally installed software. They typically offer very specialized and sophisticated enrichment functionality for certain kinds of master data.

 

There are various service providers in the market with which an organization can sign up, such as DnB, Acxiom, Intermat, Zycus, and J.P. Morgan. You send them basic information about the master data, and they send back complete, accurate, and clean end-to-end information. Some providers, such as DnB, also maintain scores and ratios on various risk aspects that they calculate. Of course, none of this comes for free!

 

Consider the example of enriching customer address data. Third-party service providers such as Dun & Bradstreet (DnB) provide industry-standard, accurate information on global entities. Given a small set of input fields such as DUNS number, name, city, and country, DnB replies with a wealth of information, such as the WorldBase package containing the organization's hierarchy and its specific details like email, phone, website, and other address data.

 

Using SAP MDM and the Enrichment Controller provided by SAP, we can send a request as a web service call to DnB and then enrich our repository records with the information returned. This increases data quality enormously: errors such as typos, misspellings, and incomplete information are avoided because the data provided by DnB is standardized and accurate.

 

Similarly, other web services can be invoked from SAP MDM, for example for VAT number validation or latitude/longitude lookup. Such an enrichment architecture helps ensure high-quality master data.

 

 

                                    Untitled.png

 

 

Enrichment Controller

 

Enriching your business partner master data with the necessary information is key to your risk management initiative. You can use a single third-party data provider or opt for multiple providers to give you specific information on business partners. Requesting and gathering the information is one part, but most important is using it effectively to make better decisions. With its enrichment architecture, SAP MDM provides a service-oriented approach to easily connect and communicate with third-party providers. And because all the data is consolidated with SAP MDM acting as a central hub, it facilitates better risk management.

 

All the steps of consolidating, cleansing, and de-duplicating come in handy when enriching, because you can now ask for exactly the master data information you need, and your request details are more accurate. This makes it simpler for the third-party providers to send you the required details.

 

 

                                New Picture1.png

As shown above, SAP MDM comes with an Enrichment Controller that manages communication with third-party service providers. The steps are:

 

 

  • SAP MDM syndicates an XML file to the Enrichment Controller.
  • The Enrichment Controller forwards the file to the third-party service provider.
  • The third-party service provider analyzes the data and sends back the requested information.
  • SAP MDM imports the XML file using the Import Manager and updates the data of the specific master data object.
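The four steps above can be sketched in plain Java. This is a hedged illustration only: the maps stand in for the syndicated and imported XML files, and the provider response values are invented, not real DnB output:

```java
import java.util.HashMap;
import java.util.Map;

public class EnrichmentSketch {
    // Step 1: syndicate a minimal request (a map here, an XML file in reality).
    static Map<String, String> syndicate(String duns, String name) {
        Map<String, String> request = new HashMap<>();
        request.put("DUNS", duns);
        request.put("Name", name);
        return request;
    }

    // Steps 2-3: the Enrichment Controller forwards the request; the provider
    // analyzes it and returns the requested information (simulated lookup).
    static Map<String, String> callProvider(Map<String, String> request) {
        Map<String, String> response = new HashMap<>(request);
        response.put("Website", "www.example.com"); // illustrative values only
        response.put("Phone", "+1-555-0100");
        return response;
    }

    // Step 4: import the response and update the master data record.
    static Map<String, String> importAndMerge(Map<String, String> record,
                                              Map<String, String> response) {
        Map<String, String> merged = new HashMap<>(record);
        merged.putAll(response);
        return merged;
    }

    public static Map<String, String> demo() {
        Map<String, String> record = new HashMap<>();
        record.put("DUNS", "123456789");
        record.put("Name", "ACME Corp");
        Map<String, String> response = callProvider(syndicate("123456789", "ACME Corp"));
        return importAndMerge(record, response);
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

The point of the sketch is the round trip: a sparse record goes out, an enriched superset comes back, and the merge step is what keeps the repository record authoritative.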

Conclusion

Using enrichment at the organizational level helps the master data steward obtain correct, accurate data without discrepancies.

 

Some of the key business benefits of using External Enrichment are:

 

  • A single face of the business partner, irrespective of whether it is a customer or a vendor.
  • Reduced fault-resolution time.
  • Reduced overhead costs caused by incorrect data.
  • Efficient management thanks to enriched master data, and hence better decision making.
  • Compliance adherence as a result of avoiding unwanted business partner relations.
  • Accurate reporting by BI systems due to a consolidated, single source of business partner data.
  • Increased business revenue and better relationships with your customers and vendors: by analyzing them as business partners across geographies and systems, you can offer them better financial rates and terms.

Getting Started with SAP NetWeaver Master Data Management


Today, large and mid-scale companies operating on the base of diversified IT landscapes often suffer from master data that is inconsistently stored in multiple, disconnected systems or databases. Unmanaged master data is notoriously inaccurate, redundant, and full of discrepancies, all of which result in high maintenance costs, vulnerable business processes and poor business decisions.

SAP NetWeaver Master Data Management (MDM) enables companies to consolidate and harmonize their master data within heterogeneous IT landscapes. It consistently delivers vastly reduced data maintenance costs, ensures cross-system data consistency, accelerates the execution of business processes, and greatly improves decision-making.

 

 

Overview

SAP NetWeaver Master Data Management Solution Brief

This solution brief provides an overview of the master data management features and processes of SAP NetWeaver Master Data Management.

 

SAP NetWeaver Master Data Management: Animated Overview   (Time 00:02:30) 

An animated flash presentation providing an overview of SAP NetWeaver Master Data Management and its usage scenarios.

 

Watch MDM and BPM in Action   (Time 00:05:45) 

This animated demo shows how to flexibly compose a governed data-creation process based on SAP NetWeaver BPM and SAP NetWeaver MDM, and why this is beneficial in heterogeneous landscapes.

 

Getting Started with MDM Guides  

Release 5.5  |  Release 7.1

This document gives a high-level overview of SAP NetWeaver Master Data Management (MDM), including its functional components and main features. A tutorial specifically dedicated to new MDM customers provides a guided tour through the product and allows the readers to get hands-on experience.

MDM Tutorial Sample Data 

 

Release 5.5  |  Release 7.1

This package contains the sample data required to go through the step-by-step tutorial for new MDM customers.

 

SAP NetWeaver MDM Overview   

This presentation provides an overview of SAP NetWeaver MDM including main capabilities, applicable scenarios, and customer examples. For deeper insight into the master data integration, master data operation, and master data quality capabilities, see the level 2 presentation.

 

SAP NetWeaver Master Data Management 7.1   

Get information on the current release, SAP NetWeaver MDM 7.1.

 

More Information

Misaligned Master Data is a Compromised Corporate Asset  

Read this SAPInsider article to get a comprehensive overview of SAP NetWeaver Master Data Management in a business context.

SAP MDM Global Data Synchronization


Collaborative Business Based on Accurate and Consistent Data

SAP MDM GDS

SAP NetWeaver Master Data Management enables global data synchronization (GDS), supporting product data consistency, quality, and distribution between trading partners.

Automated and workflow-guided, it lets you manage data and communication in an integrated central console for better supply chain operations in accordance with Global Data Synchronization Network (GDSN) terms.

 

Getting Started with Global Data Synchronization

Global Data Synchronization - Solution Brief 

The Global Data Synchronization Solution Brief provides a high-level overview of this SAP NetWeaver MDM based business scenario.

 

Overview of SAP GDS 2.1  

Presentation outlining the features and enhancements of Global Data Synchronization, release 2.1.

 

Colgate-Palmolive on the Benefits of Global Data Synchronization   (Time 03:55)

Colgate-Palmolive speaks about the benefits of Global Data Synchronization with SAP NetWeaver MDM.

 

FEATURED EVENTS   (Time 38:00)

A tutorial eLearning session providing an overview of Global Data Synchronization with SAP NetWeaver MDM (based on the GDS 1.0 release).

 

More on Global Data Synchronization

Industry Standards Drive Business Network Transformation   

Industry standards bodies such as GS1 and RosettaNet are essential for companies to transform their business networks to support business processes that span these networks using disparate applications. SAP is closely aligned with these standards organizations and, along with our partners, has adopted these standards throughout our solution offerings.

 

Functional Documentation  

SAP Help Portal documentation describing features and functions of the current release, Global Data Synchronization 2.1.

 

Technical Documentation  (SMP Login Required)

Technical Guides about the current release, Global Data Synchronization 2.1.

Featured Content in SAP NetWeaver Master Data Management


What's New with Support Package 11 for SAP NetWeaver MDM 7.1

Find out about the enhancements coming with SAP NetWeaver MDM 7.1 SP 11. 03 January 2014

 

Join Customer Survey on SAP NetWeaver Master Data Management

Please help to continuously enhance SAP NetWeaver Master Data Management and take part in the current customer survey. 03 December 2013

 

Combining SAP NetWeaver MDM with SAP Lumira

Read this blog to find out how you can visualize master data information for analytical purposes using SAP Lumira. 20 November 2013

 

Setting Up Communication Between Two MDM Repositories


In SAP MDM, when we create multiple repositories in a landscape, there is no standard way to exchange data between two repositories.

 

MDM 1.jpg

 

Although there is no standard approach to meet this requirement, it can still be achieved.

 

In our project there was a similar requirement: sub-tables of two different repositories had to communicate with each other, i.e. if either sub-table was updated or a new record was created, the change should be reflected in the sub-table of the other repository too.

 

The approach goes like this: we create an inbound and an outbound port for the required repositories, and use a UNIX scheduler (since our MDM server is installed on a UNIX system) to run a script that moves XML files (the updated data) from the outbound Ready folder of one repository to the inbound Ready folder of the other repository.

 

 

MDM 2.jpg

 

Using this approach we were able to pass data from one repository to another.
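The script's job can be sketched in Java as well. This is illustrative only: the real project used a Unix shell script run by a scheduler, and the folder names and demo setup below are assumptions, not the actual port paths:

```java
import java.io.IOException;
import java.nio.file.*;

public class PortFileMover {
    // Move every XML file from the outbound Ready folder to the inbound Ready
    // folder of the other repository; returns the number of files moved.
    public static int moveXmlFiles(Path outboundReady, Path inboundReady) throws IOException {
        Files.createDirectories(inboundReady);
        int moved = 0;
        try (DirectoryStream<Path> xmlFiles =
                 Files.newDirectoryStream(outboundReady, "*.xml")) {
            for (Path file : xmlFiles) {
                // Move (rather than copy) so the outbound port does not
                // re-process the same file on the next scheduler run.
                Files.move(file, inboundReady.resolve(file.getFileName()),
                           StandardCopyOption.REPLACE_EXISTING);
                moved++;
            }
        }
        return moved;
    }

    // Self-contained demo using temporary folders in place of the real MDM ports.
    public static int demo() throws IOException {
        Path out = Files.createTempDirectory("outbound-ready");
        Path in = Files.createTempDirectory("inbound-ready");
        Files.writeString(out.resolve("update1.xml"), "<record/>");
        return moveXmlFiles(out, in);
    }

    public static void main(String[] args) throws IOException {
        System.out.println("moved " + demo() + " file(s)");
    }
}
```

Whatever implements the move, it should run on a schedule short enough that the inbound port picks up changes promptly, and it should move files whole so the importing repository never sees a half-written XML file.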



SRM-MDM Catalog Implementation – End to End Procedure


SRM-MDM Catalog Implementation – End to End Procedure

 

 

Applies to:

Standard SAP MDM 7.1 Console, SAP SRM 7.0, SRM–MDM catalog 3.0 SP11, SRM-MDM catalog content 3.0, MDM SRM UI App. For more information, visit the Master Data Management homepage.


Summary

The main purpose of this article is to show how to integrate the SRM-MDM Catalog, with a basic step-by-step implementation guide.

 

Author:        Nitish Sharma

Company:    Mahindra Satyam

Created on:  21 June 2012


Author Bio

1.png 

 

Nitish Sharma has been associated with Mahindra Satyam for 15 months as part of its MDM practice. He is skilled in SAP MDM, SAP MDG, SRM-MDM, Java, and Java EE, and holds a Bachelor's degree in Computer Science.

                                                                                                

 

 

 

Table of Contents

 

1) Background

2) Prerequisites

3) MDM configuration

            Creating a mask

            Assigning records to mask

4) UI configuration

5) SRM configuration

6) Conclusion

7) Related Content

8) Copyright

 

1) Background


The purpose of this document is to show connectivity for SRM-MDM Catalog, creation of catalogs and creation of OCI mappings. Relevant screenshots are provided that explain the whole end to end procedure to set up SRM-MDM Catalog content.

 

  2) Prerequisites

 

System Type                    Service Pack   Purpose

SAP MDM 7.1 Console            7.1.07.263     To set up the SRM-MDM Catalog repository

SAP SRM 7.0                    SP9            To set up the catalog ID & call structure

SRM-MDM Catalog 3.0            SP11           Catalog repository

SRM-MDM Catalog Content 3.0    -              SRM-MDM catalog content deployment

MDM SRM UI App                 -              To configure the OCI structure & search engine

 

  3) MDM configuration


Creating a mask

 

25.png


Assigning records to mask

26.png

 

  4) UI configuration

 

Enter all the details to login

 

4.png

 

Every user is assigned to a repository; select the user as maintained in the web service.

 

5.png

The Configuration view shows the user-specific configuration of the SRM-MDM Catalog. In the General pane, you can set shopping options, item display criteria, search, shopping lists, and OCI mapping.

 

6.png

7.png

 

The OCI mapping is shown in the screenshots below.

8.png

9.png

Under Customize Display -> Item Lists, select the fields you want to work with on the portal: select a field and click Add.

10.png

For example

11.png

 

Under Customize Display -> Cart Preview, we can preview the result of the previous step.

 

12.png

 

Under Customize Display -> Compare, we can compare the positions of the fields selected in the previous step with the SRM-MDM repository fields. On the right-hand side, the Up and Down buttons adjust the field positions.

 

13.png

Under the Customize Search tab, you can specify which fields can be searched and what the conditions should be, for example words that "start with n or m".

 

14.png

15.png

For example

 

16.png

17.png

 

 

5) SRM configuration

 

Go to transaction SPRO (this requires SRM-side authorization to log in to SRM).

 

18.png

Click on SAP Reference IMG.

19.png

 

Navigate to SAP Implementation -> SAP SRM -> SRM Server -> Master Data -> Content Management -> Define External Web Services.

20.png

 

Enter the catalog web service ID and a description for the mask created in SAP MDM.

 

 

21.png

After creating the web service ID, go to the standard call structure and assign the relevant parameters, including the repository name (to which the catalog belongs) and the catalog name (the mask name).

22.png

 

After assigning all the parameters, we have to assign a user ID to the particular catalog. Go to transaction PPOSA_BBP, click on Select Attribute, then the '+' symbol to add the user ID and catalog.

24.png

 

  6) Conclusion

With the above procedure implemented, we can now access the catalog (the mask created in MDM) on the portal; the ID created in the last step can access only its assigned catalog. The fields selected in the UI configuration step are visible on the portal, as is the catalog name (web service ID) given in the standard call structure. The assigned user can now access the catalog and create a shopping cart with the items assigned to that catalog.

 

7) Related Content

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/806f3d56-0d29-2d10-9abf-c0df6f0fdfe8?quicklink=index&overridelayout=true

http://help.sap.com/saphelp_srmmdm10/helpdata/en/44/ec6f42f6e341aae10000000a114a6b/frameset.htm

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0940876-42fe-2d10-77be-a82aaa163e13?QuickLink=index&overridelayout=true

http://213.41.80.15/SAP_ELearning/OKEC/nav/content/011000358700000229332007E.PDF

 

 

8) Copyright

 

© Copyright 2012 SAP AG. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors.

Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation.

IBM, DB2, DB2 Universal Database, System i, System i5, System p, System p5, System x, System z, System z10, System z9, z10, z9, iSeries, pSeries, xSeries, zSeries, eServer, z/VM, z/OS, i5/OS, S/390, OS/390, OS/400, AS/400, S/390 Parallel Enterprise Server, PowerVM, Power Architecture, POWER6+, POWER6, POWER5+, POWER5, POWER, OpenPower, PowerPC, BatchPipes, BladeCenter, System Storage, GPFS, HACMP, RETAIN, DB2 Connect, RACF, Redbooks, OS/2, Parallel Sysplex, MVS/ESA, AIX, Intelligent Miner, WebSphere, Netfinity, Tivoli and Informix are trademarks or registered trademarks of IBM Corporation.

Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.

Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of Adobe Systems Incorporated in the United States and/or other countries.

Oracle is a registered trademark of Oracle Corporation.

UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.

Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc.

HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology.

Java is a registered trademark of Oracle Corporation.

JavaScript is a registered trademark of Oracle Corporation, used under license for technology invented and implemented by Netscape.

SAP, R/3, SAP NetWeaver, Duet, PartnerEdge, ByDesign, SAP Business ByDesign, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and other countries.

Business Objects and the Business Objects logo, BusinessObjects, Crystal Reports, Crystal Decisions, Web Intelligence, Xcelsius, and other Business Objects products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of Business Objects S.A. in the United States and in other countries. Business Objects is an SAP company.

All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.

These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

End To End Java Framework for SAP Enterprise MDM 7.1


Applies to:

SAP MDM 7.1 SP04

MDM Java API

SAP NetWeaver 7.0

Summary

This article describes a Java EE framework for building an integrated application on SAP NetWeaver Java WebDynpro using the MDM Java API. The enterprise-level Java framework customizes the MDM Java API framework written by Richard LeBlanc of SAP and adds a few more layers to it. It provides a roadmap for creating a Java WebDynpro application that delivers SAP MDM CRUD functionality in a large project in a distributed environment.

 

Author(s):    Kousik Mukherjee

Company:    HCL AXON

Created on:  17 January 2012

Author Bio

Kousik Mukherjee is currently working as a Technical Specialist with HCL AXON, India, in the SAP Service Division. He has 5 years of experience with the SAP NetWeaver platform and ECC integration in heterogeneous areas such as MDM and EP.

 

 

 

Table of Contents

The MDM Picture

The Need for Master Data Management through Enterprise Portal

The Framework Architecture

The Understanding of the Flow

The WD View
    WD Button Action: View
    WD Custom Controller: CustomsController

The DTO
    Custom DTO

The Business Delegate
    Business Delegate Factory
    Item Create Delegate
    Item Create Delegate Impl
    Service Locator

The EJB
    Item Creation Stateless Session Bean Home
    Item Creation Stateless Session Bean
    Item Creation Stateless Session Bean Class

The DAO
    DAO Factory
    Item Create DAO
    Item Create DAO Impl

The MDM Java API
    MDM Connection Manager
    Server
    Server Impl
    Repository
    Repository Impl
    Session
    Session Impl

The End Result

Copyright

 

 

The MDM Picture

All businesses, no matter what their size, rely on data to record and analyze business activity. It is the lifeblood of any business operation. Data enters the enterprise during specific process activities, either through the keyboard, via electronic messages or via electronic files. It then flows throughout the enterprise to support every process activity from registering new customers and sales order taking to supplier procurement, product fulfillment, product delivery, invoicing and payment collection.  There are two broad categories of structured data that any business relies on.

  1. Master Data
  2. Transaction Data

    

Master data is simply the data associated with core business entities such as customer, employee, supplier, product, partner, asset, etc. This data can reside in many different systems.

 

Transaction data, on the other hand, is simple and straightforward: it is the recording of business transactions such as purchase orders in manufacturing, leave applications in HR, and credit card payments in banking.

 

There are many reasons why master data management is needed in business. New business challenges such as globalization, mergers and acquisitions, risk, compliance, customer loyalty, supply chain complexity, and cost-cutting all benefit from having master data management in place.

The Need for Master Data Management through Enterprise Portal

When subsets of master data exist in multiple operational systems, it is not uncommon to see that data being independently maintained by each of those systems. When this happens, it is obvious that multiple data entry applications can cause problems that affect business operations, business performance and customer satisfaction. These problems include:

  • Data conflicts
  • Confusion when duplicate master data does not agree
  • Process delays and process defects caused by data errors
  • Inability to respond when changes to master data require prompt business action
  • Additional operational costs to solve problems caused by data conflicts

To counter these problems, the SAP MDM system should provide easy access through the enterprise portal to its management of data associated with specific master data entities such as customer, product, and employee.

 

 

 

 

 

 

 

 

 

The Framework Architecture

A Java WebDynpro application is well suited to integrating the CRUD operations in SAP MDM with SAP EP. The Java framework on top of the MDM Java API comprises six layers, as shown below.

 

Note: This three-tier architecture caters to large projects, where EJB and DAO layers are used so that the business logic may reside in a separate system, while the view (the WD application) and the data (the MDM system) may reside on different systems. Italics signify interfaces.

The Understanding of the Flow

Here, we will try to understand each layer by looking at code examples.

The WD View

WD Button Action: View

This is an example of a code in the ItemCreateView:

 

  //@@begin javadoc:onActionSave_Customs(ServerEvent)
  /** Declared validating event handler. */
  //@@end
  public void onActionSave_Customs(com.sap.tc.webdynpro.progmodel.api.IWDCustomEvent wdEvent )
  {
    //@@begin onActionSave_Customs(ServerEvent)
    wdThis.wdGetCustomsControllerController().saveCustomsData();
    //@@end
  }

WD Custom Controller: CustomsController

This is an example of a code in the CustomsController class:

 

  //@@begin javadoc:saveCustomsData()
  /** Declared method. */
  //@@end
  public void saveCustomsData( )
  {
    //@@begin saveCustomsData()
    String portalUserName = wdContext.currentContextElement().getPortalUserName();
    String itemNumber = wdContext.currentContextElement().getItemNumber();
    String selectedSendToERP = wdContext.currentContextElement().getSelectedSendToERP();

    CustomsDTO customsDTO = new CustomsDTO();
    customsDTO.setL_CountryOfOrigin(wdContext.currentCustomsElement().getL_CountryOfOriginAndDesc());
    customsDTO.setL_CustomsRemark(wdContext.currentCustomsElement().getL_CustomsRemark());
    customsDTO.setL_MilitaryCategoryNumber(wdContext.currentCustomsElement().getL_MilitaryCategoryNumber());

    try {
      BusinessDelegateFactory.getInstance().getItemCreationDelegate().updateCustomsData(portalUserName, itemNumber, customsDTO, selectedSendToERP);
    } catch (Exception e) {
      wdComponentAPI.getMessageManager().raiseException(e.toString(), false);
      e.printStackTrace();
    }
    //@@end
  }

 

The DTO

Custom DTO

This is an example of a Data Transfer Object class:

public class CustomsDTO implements Serializable {

  

   private String l_CountryOfOrigin = "";

   private String l_CustomsRemark = "";

   private String l_MilitaryCategoryNumber = "";

      public String getL_CountryOfOrigin() {

            return l_CountryOfOrigin;

      }

 

 

   public String getL_CustomsRemark() {

          return l_CustomsRemark;

   }

   public String getL_MilitaryCategoryNumber() {

          return l_MilitaryCategoryNumber;

   }

   public void setL_CountryOfOrigin(String string) {

          l_CountryOfOrigin = string;

   }

   public void setL_CustomsRemark(String string) {

          l_CustomsRemark = string;

   }

   public void setL_MilitaryCategoryNumber(String string) {

          l_MilitaryCategoryNumber = string;

   }

  }
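Because the DTO crosses the remote EJB boundary, it must serialize cleanly. The following is a minimal sketch (not part of the original framework) showing the round trip the container performs, with a fixed serialVersionUID added — a practice worth adopting in the real CustomsDTO as well so that serialized forms stay compatible across recompilation:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Minimal sketch of a Serializable DTO mirroring CustomsDTO above.
public class CustomsDTOSketch implements Serializable {

    // Fixed version keeps serialized forms compatible across recompilation.
    private static final long serialVersionUID = 1L;

    private String l_CountryOfOrigin = "";

    public String getL_CountryOfOrigin() { return l_CountryOfOrigin; }
    public void setL_CountryOfOrigin(String s) { l_CountryOfOrigin = s; }

    // Round-trips the DTO through Java serialization, as the EJB
    // container would when the bean is called remotely.
    public static CustomsDTOSketch roundTrip(CustomsDTOSketch dto) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(dto);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            return (CustomsDTOSketch) in.readObject();
        }
    }
}
```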

The Business Delegate

Business Delegate Factory

This is an example of a Business Delegate Factory for delegating the flow to Item Creation:

public final class BusinessDelegateFactory {

  

   private static BusinessDelegateFactory delegateFactory = null;

   private BusinessDelegateFactory() {

   }     

   public static BusinessDelegateFactory getInstance() {

          if (delegateFactory == null) {

                delegateFactory = new BusinessDelegateFactory();

          }

          return delegateFactory;

   }     

   public ItemCreationDelegate getItemCreationDelegate()
          throws Exception {
          return new ItemCreationDelegateImpl();
   }
}
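Note that the lazy null-check in getInstance() above is not synchronized, so under concurrent Web Dynpro requests two factory instances could be created. A possible alternative (a sketch, not the original code) is the initialization-on-demand holder idiom, where the JVM's class-initialization guarantee replaces explicit locking:

```java
// Sketch of the same factory using the initialization-on-demand holder
// idiom: the JVM initializes the nested Holder class exactly once, on
// first use, so getInstance() needs no synchronization.
public final class HolderBasedFactory {

    private HolderBasedFactory() { }

    private static final class Holder {
        static final HolderBasedFactory INSTANCE = new HolderBasedFactory();
    }

    public static HolderBasedFactory getInstance() {
        return Holder.INSTANCE;
    }
}
```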

Item Create Delegate

This is an example of an Item Create Delegate Interface:

public interface ItemCreationDelegate {

 

   public void updateCustomsData(String mdmUserName, String itemNumber, CustomsDTO customsDTO, String sendToERP)throws Exception;

 

}

Item Create Delegate Impl

This is an example of an Item Create Delegate Implementation class:

public class ItemCreationDelegateImpl implements ItemCreationDelegate {

   

   public void updateCustomsData(String mdmUserName, String itemNumber, CustomsDTO customsDTO, String sendToERP)throws Exception

    {

   ServiceLocator.getInstance().getItemCreationSLSB().updateCustomsData(mdmUserName, itemNumber, customsDTO,sendToERP);

    }

}

Service Locator

This is an example of a Service Locator class:

public final class ServiceLocator {

  

   private Context jndiContext;

 

   // Local home and remote home stubs are cached against their JNDI names.

   private Map localHomeCache;

   private Map remoteHomeCache;

   private boolean lookupRemoteStub = true;

 

   private static ServiceLocator serviceLocator;

   private static final String DEFAULT_JNDI_HOST = "xxxxxx";

   private static final String DEFAULT_JNDI_PORT = "99999";

 

   private static final String INITIAL_CONTEXT_FACTORY =

          "com.sap.engine.services.jndi.InitialContextFactoryImpl";

 

   private ServiceLocator() throws ServiceLocatorException {

         

          try {
                PropertyLoader loader = PropertyLoader.getInstance();

               

                //String jndiHost = DEFAULT_JNDI_HOST;

                String jndiHost =  loader.getProperty("JNDI_HOST",DEFAULT_JNDI_HOST);

                //String jndiPort = DEFAULT_JNDI_PORT;

                String jndiPort = loader.getProperty("JNDI_PORT", DEFAULT_JNDI_PORT);

 

                String jndiURL = jndiHost + ":" + jndiPort;                         

               

                String lookupRemote = loader.getProperty("LOOKUP_REMOTE", "false");

                lookupRemoteStub = Boolean.valueOf(lookupRemote).booleanValue();

                                    

                Properties props = new Properties();

                props.put(Context.PROVIDER_URL, jndiURL);

                props.put(Context.INITIAL_CONTEXT_FACTORY, INITIAL_CONTEXT_FACTORY);

 

                if(lookupRemoteStub)

                       jndiContext = new InitialContext(props);

                else  

                       jndiContext = new InitialContext();

                                    

                localHomeCache = new HashMap();

                remoteHomeCache = new HashMap();

               

          } catch (Exception exp) {

               

                throw new ServiceLocatorException(

                       "Failed to perform JNDI lookup.", exp);

          }

   }

   public static ServiceLocator getInstance() throws ServiceLocatorException {

         

          if (serviceLocator == null) {

                serviceLocator = new ServiceLocator();

          }

          return serviceLocator;

   }

}
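The localHomeCache and remoteHomeCache maps above are declared but their use is not shown in this excerpt. The caching pattern they are intended for can be sketched as follows — a plain Map stands in for the JNDI context here, whereas the real class would call jndiContext.lookup(jndiName) (the class and method names in this sketch are illustrative, not part of the original framework):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the home-stub caching the ServiceLocator's maps are meant
// for: repeated requests for the same JNDI name hit the (simulated)
// naming context only once.
public class HomeCacheSketch {

    private final Map<String, Object> homeCache = new HashMap<String, Object>();
    private final Map<String, Object> jndiContext; // stand-in for InitialContext
    private int lookups = 0;                       // counts real lookups

    public HomeCacheSketch(Map<String, Object> jndiContext) {
        this.jndiContext = jndiContext;
    }

    // Returns the cached stub if present; otherwise performs one lookup
    // and caches the result.
    public Object getHome(String jndiName) {
        Object home = homeCache.get(jndiName);
        if (home == null) {
            lookups++;
            home = jndiContext.get(jndiName); // real code: jndiContext.lookup(...)
            homeCache.put(jndiName, home);
        }
        return home;
    }

    public int lookupCount() { return lookups; }
}
```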

The EJB

Item Creation Stateless Session Bean Home

This is an example of an Item Creation Stateless Session Bean Home interface:

public interface ItemCreationSLSBHome extends EJBHome {

 

   /**

    * Create Method.

    */

   public ItemCreationSLSB create() throws CreateException, RemoteException;

 

}

Item Creation Stateless Session Bean

This is an example of an Item Creation Stateless Session Bean interface:

public interface ItemCreationSLSB extends EJBObject {

 

   /**

    * Business Method.

    */

public void updateCustomsData(String mdmUserName, String itemNumber, CustomsDTO customsDTO, String sendToERP)throws Exception;

  

}

Item Creation Stateless Session Bean Class

This is an example of an Item Creation Stateless Session Bean class:

public class ItemCreationSLSBBean implements SessionBean {     

  

   public void updateCustomsData(String mdmUserName, String itemNumber, CustomsDTO customsDTO, String sendToERP)throws Exception

   {

   DAOFactory.getInstance().getItemCreateDAO().updateCustomsData(mdmUserName, itemNumber,customsDTO,sendToERP);

   }     

}

The DAO

DAO Factory

This is an example of a DAO Factory for moving the control to Item Create DAO:

       public final class DAOFactory {

           private static DAOFactory instance;

           private DAOFactory() {

           }

           public static DAOFactory getInstance() {

              if (instance == null)

                     instance = new DAOFactory();

              return instance;

           }

           public ItemCreateDAO getItemCreateDAO() {

                 return new ItemCreateDAOImpl();

           }

       }

Item Create DAO

This is an example of an Item Create Data Access Object interface:

public interface ItemCreateDAO {

  

   public void updateCustomsData(String mdmUserName, String itemNumber, CustomsDTO customsDTO, String sendToERP)throws Exception;  

}

Item Create DAO Impl

This is an example of an Item Create DAO Implementation class:

public class ItemCreateDAOImpl implements ItemCreateDAO {       

 

    private RepositorySchema repositorySchema = null;

    private UserSessionContext userSessionContext = null;

    private RepositorySessionContext repositorySessionContext = null;

    private String lookupFlatTableName = "";

    private String keyFieldName = "";

    private String valueFieldName = "";    

 

public void updateCustomsData(String mdmUserName, String itemNumber, CustomsDTO customsDTO, String sendToERP) throws Exception {

          String mainTableName = MDMSchema.Items.TABLE;
          String searchFieldName = MDMSchema.Items.ITEM_NUMBER;
          String searchFieldValue = itemNumber;
          String searchFieldConstraint = "EQUALS";
          String[] selectedMainTableFieldNameArray = {
                 MDMSchema.Items.LOCAL_DATA,
                 MDMSchema.Items.INTRASTAT_ITEM,
                 MDMSchema.Items.SUPPLIER_PRICE_LIST_DATA
          };

          // searchMDMRecordDetails and dataProgram are assumed to be built
          // elsewhere in this class from the criteria above; they are not
          // shown in this excerpt.
          SearchProgram searchProgram = FieldSearchProgram.TEXT;
          TableSchema tableSchema = repositorySchema.getTableSchema(mainTableName);
          Record[] records = searchProgram.execute(repositorySchema, userSessionContext, tableSchema, mainTableName, searchMDMRecordDetails);
          FieldValuePair fieldValuePair = null;
          FieldValuePair[] fieldValuePairs = new FieldValuePair[3];

          if (records != null) {
                TableSchema mainTableSchema = repositorySchema.getTableSchema(mainTableName);
                QualifiedLookupValue qlv = null;
                QualifiedLinkValue[] links = null;

                // Re-read the matched record by ID to get its current field values.
                records[0] = dataProgram.getRecordByID(userSessionContext, mainTableSchema, records[0].getId());

                fieldValuePair = new FieldValuePair(mainTableSchema.getFieldId(MDMSchema.Items.INTRASTAT_ITEM), new BooleanValue(customsDTO.isG_IntrastatItem()));
                fieldValuePairs[0] = fieldValuePair;

                qlv = (QualifiedLookupValue) records[0].getFieldValue(mainTableSchema.getFieldId(MDMSchema.Items.LOCAL_DATA));
                links = qlv.getQualifiedLinks();

                // ... the remainder of the qualified-link update logic is
                // omitted in the original example ...
          }
   }
}

 

The MDM Java API

MDM Connection Manager

This is an example of an MDM Connection Manager class:

public class MDMConnectionManager {

               

   private static String MDM_SERVER_HOST = "xxxxxxxx";

   private static String MDM_REPOSITORY_NAME ="xxxxxxxx";

   private static String MDM_REGION_NAME = "English [US]";

 

   private static Repository[] repositories = null;

   private static Repository repository = null;

   private static RegionProperties regionProperties = null;

   private static RepositorySchema repositorySchema = null;

 

  

   public static Repository getRepository(String userName) throws Exception {

 

   Server server = Server.getInstance(MDM_SERVER_HOST);

   repositories = server.getRepository();

  

   for (int i = 0, j = repositories.length; i < j; i++) {

 

          if (repositories[i].getIdentifier().getName().equals(MDM_REPOSITORY_NAME)) {

          repository = repositories[i];

    }

    }

   RegionProperties[] regions = repository.getRegions();        

 

   RegionProperties region = null;

   for (int i = 0, j = regions.length; i < j; i++) {

 

          if (regions[i].getName().equals(MDM_REGION_NAME)) {

                region = regions[i];

          }

   }

   repository.login(region, userName);

   return repository;

   }

}
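One caveat in getRepository() above: if no repository or region name matches, the loop finishes with the variable still null and the next call throws a NullPointerException. The name-matching can be sketched as a fail-fast helper (an illustration, not part of the MDM API), which also stops scanning at the first hit:

```java
// Sketch of the name-matching loops in getRepository(), generalized and
// made fail-fast: if nothing matches, throw a descriptive exception
// instead of continuing with null as the class above would.
public class NameLookupSketch {

    public static int indexOfName(String[] names, String wanted) {
        for (int i = 0; i < names.length; i++) {
            if (names[i].equals(wanted)) {
                return i; // first match wins; no need to keep scanning
            }
        }
        throw new IllegalArgumentException("No entry named: " + wanted);
    }
}
```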

 

Server

This is an example of a Server Abstract class:

abstract public class Server {

 

   protected String hostName;

 

   /**

    * Gets a server using the specified host name.

    *

    * @param hostName - the host name of the MDM Server

    * @return a server for the given host name

    */

   static public Server getInstance(String hostName) {

          Server server = new ServerImpl(hostName);
          return server;
   }
}

Server Impl

This is an example of a Server Implementation class:

class ServerImpl extends Server {

 

   private Repository[]  repositories;

 

   ServerImpl(String hostName) {

         

          this.hostName = hostName;

         

          loadRepository();         

   }

  

   /*

    * Loads all mounted repositories for this server.

    */   

   private void loadRepository() {                

         

          GetMountedRepositoryListCommand cmd = new GetMountedRepositoryListCommand(getConnection());

 

          try {

                cmd.execute();

 

          } catch (CommandException e) {

 

                e.printStackTrace();

          }

 

          RepositoryIdentifier[] repIdentifiers = cmd.getRepositories();

         

          repositories = new Repository[repIdentifiers.length];

         

          for(int i=0, j=repIdentifiers.length; i<j; i++) {

               

                repositories[i] = new RepositoryImpl(this, repIdentifiers[i]);

                      

          }                   

   }
}

Repository

This is an example of a Repository interface:

public interface Repository {       

 

   public void login(RegionProperties region, String user);

   public void login(RegionProperties region, String user,String password );

  

   /**

    * Return the server this repository belongs to.

    */

   public Server getServer();

         

   /**

    * Returns the schema for this repository.

    *

    * @see com.sap.mdm.schema.RepositorySchema

    */

   public RepositorySchema getSchema();

  

   /**

    * Returns the session for this repository.

    *

    * @see com.sap.nw.mdm.rig.session.Session

    */

   public Session getSession();     

}

Repository Impl

This is an example of a Repository Implementation class:

class RepositoryImpl implements Repository{

  

   private Server server;

   private RepositoryIdentifier repositoryIdentifier;

   // Region of the active session, set by login().
   private RegionProperties region;

         

   RepositoryImpl(Server server, RepositoryIdentifier repositoryIdentifier) {

         

          this.server = server;            

          this.repositoryIdentifier = repositoryIdentifier;            

   }

         

   public Server getServer() {      

          return server;            

   }

  

   private Session session;

   public void login(RegionProperties region, String user) {

         

          session = new SessionImpl(server.getConnection(), repositoryIdentifier,

                                                                region, user);                                                        

          this.region = region;            

   }

  

   public void login(RegionProperties region, String user,String password) {

         

          session = new SessionImpl(server.getConnection(), repositoryIdentifier,

                                                          region, user,password);

                                                         

                this.region = region;            

          }

   public Session getSession() {

         

          return session;           

   }     

}

Session

This is an example of a Session interface:

public interface Session {

 

   public String getUserSession();  

  

   public String getRepositorySession();   

  

   public void destroy();

}

Session Impl

This is an example of a Session Implementation class:

class SessionImpl implements Session{

 

   private String userSession;

   private String repositorySession;

   private ConnectionAccessor connection;

  

   SessionImpl(ConnectionAccessor connection, RepositoryIdentifier repository, RegionProperties region, String user) { 

         

          this.connection = connection;           

          repositorySession = getRepositorySession(connection, repository);

         

          authenticateRepositorySession(connection, repositorySession, user);       

          userSession = getUserSession(connection, repository, region);       

          authenticateUserSession(connection, userSession, user);                                    

   }

   // getUserSession(), getRepositorySession() and destroy() are simple
   // accessors/cleanup methods and are omitted from this excerpt.
}

The End Result

Having spent so much effort understanding and building an EP & MDM integrated application, it is important to see the end result. Using our framework, we can develop a Web Dynpro application that will Search, Modify, Add, and Remove (CRUD) data in a SAP MDM 7.1 repository.
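The CRUD operations the framework ultimately performs against the repository can be pictured with a tiny in-memory stand-in keyed by item number. This is purely an illustration of the operation set, not the MDM Java API:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative in-memory stand-in for the CRUD operations the framework
// performs against an MDM repository; keys are item numbers.
public class CrudSketch {

    private final Map<String, String> records = new HashMap<String, String>();

    public void add(String itemNumber, String data)    { records.put(itemNumber, data); }
    public String search(String itemNumber)            { return records.get(itemNumber); }
    public void modify(String itemNumber, String data) { records.put(itemNumber, data); }
    public void remove(String itemNumber)              { records.remove(itemNumber); }
    public int size()                                  { return records.size(); }
}
```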


Related Content

SAP NetWeaver MDM Java API

Java Development

WebDynpro Java

 

 

Copyright

© Copyright 2012 SAP AG. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors.

Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation.

IBM, DB2, DB2 Universal Database, System i, System i5, System p, System p5, System x, System z, System z10, System z9, z10, z9, iSeries, pSeries, xSeries, zSeries, eServer, z/VM, z/OS, i5/OS, S/390, OS/390, OS/400, AS/400, S/390 Parallel Enterprise Server, PowerVM, Power Architecture, POWER6+, POWER6, POWER5+, POWER5, POWER, OpenPower, PowerPC, BatchPipes, BladeCenter, System Storage, GPFS, HACMP, RETAIN, DB2 Connect, RACF, Redbooks, OS/2, Parallel Sysplex, MVS/ESA, AIX, Intelligent Miner, WebSphere, Netfinity, Tivoli and Informix are trademarks or registered trademarks of IBM Corporation.

Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.

Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of Adobe Systems Incorporated in the United States and/or other countries.

Oracle is a registered trademark of Oracle Corporation.

UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.

Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc.

HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology.

Java is a registered trademark of Oracle Corporation.

JavaScript is a registered trademark of Oracle Corporation, used under license for technology invented and implemented by Netscape.

SAP, R/3, SAP NetWeaver, Duet, PartnerEdge, ByDesign, SAP Business ByDesign, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and other countries.

Business Objects and the Business Objects logo, BusinessObjects, Crystal Reports, Crystal Decisions, Web Intelligence, Xcelsius, and other Business Objects products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of Business Objects S.A. in the United States and in other countries. Business Objects is an SAP company.

All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.

These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.


MDM transport


Hi All,

 

This document is intended to assist you when performing an MDM transport using the MDM Console. The pre- and post-processing steps may differ according to your configuration.

 

This document covers the manual MDM steps for the transports. An overview of the complete process is listed in the six points below.

1. Stop MDM Adapter on PI

2. Unload MDM Repository

3. Transport MDM Repository

4. Load MDM Repository with Update Indices

5. Start the MDM Adapter on PI

6. Monitor PI for MDM messages

 

 

1. Stop MDM Adapter on PI

Login into the NWA (NetWeaver Administrator) on the relevant PI Java Engine

 

E.g.: http://hostName:port/nwa

      -> go to Operation Management

      -> Systems

      -> Start & Stop

      -> Java EE Applications

      -> Search for Name = tc~mdm~adapter~svc

      -> Highlight that row

      -> STOP the application  (for all Instances)


 

2. Unload MDM Repository

a. Open the MDM Console for the MDM Server you are transporting to.

b. Mount the MDM Server.

c. In the following screen, provide the server name (and port number, if required) to mount the MDM Server.

d. Right-click on the repository name under the mounted MDM Server, select Connect to Repository, and enter user credentials in the pop-up window.

e. The screen will look something like this (notice that the repository has a small green arrow next to it, which means it is LOADED).

f. Unload the repository.

    

 

3. Transport MDM Repository

a. Import the MDM delta file.

b. Select the request number in the following screen.

Press OK; the transport is imported within a short period of time and a success message is displayed. If any errors are generated, the MDM technical resource needs to be contacted to resolve the issue.

   

   

4. Load MDM Repository with Update Indices

     On successful import, load the repository with Update Indices.

 

Import is now done!!!

 

5. Start the MDM Adapter on PI

Login into the NWA (NetWeaver Administrator) on the relevant PI Java Engine


E.g.: http://hostName:port/nwa

      -> go to Operation Management

      -> Systems

      -> Start & Stop

      -> Java EE Applications

      -> Search for Name = tc~mdm~adapter~svc

      -> Highlight that row

      -> START the application (for all Instances)


 

6. Monitor PI for MDM messages

a. Open the Runtime Workbench (RWB) in a browser: http://hostName:port/rwb/index.jsp

b. Click Display to open Component Monitoring. Click on Adapter Engine in the list of components, then in the Status window click on Communication Channel Monitoring.

c. Filter on Adapter Type MDM to get a list of all MDM adapters. MDM channels will have the status “unknown” or the error message “not registered”.

d. After startup of the MDM Adapter Service, the status should be “functioning”. An error in the adapter might be resolved by stopping and starting the adapter.

 

DONE!!!

 

Hope this document is helpful for MDM transport imports.

 

Regards,

Prithviraj

PIM-PLM Communications using SAP MDM


Applies to:

 

SAP MDM 7.1

 

Summary:

 

Product Master Data Management is a key component of an Information Management solution. Organizations across various industries are initiating Digital Transformation programs to support their business objectives and growth. IT landscapes are becoming complex as numerous applications are deployed, which puts critical requirements on Product Master Data Management. Typically, PLM is the authoring system for product information, while PIM is the system of reference where product information is enriched. Ideally, PLM pushes quality information to PIM, where it is enriched further before syndication to other downstream systems. There are, however, instances where PLM also demands enriched and cleansed product information back from PIM. This white paper illustrates how this PLM-PIM communication is handled from the PIM perspective using SAP MDM.

 

Author: Ajay Vaidya

Company: Tata Consultancy Services

Created on: 20 December 2014

 

Author Bio

Ajay has been associated with Tata Consultancy Services (a world-leading information technology consulting and services organization). He heads the Product MDM Practice in TCS.

 

Context:


 

Managing information integration is a challenging aspect. It is even more challenging when "two-way" communication is required. Due care has to be taken while designing the solution so that it does not corrupt the product data accidentally.

 

Communication between PLM and PIM

Typically, when PLM is present in the solution, it becomes the author of the product information. PLM orchestrates the process of physical product creation through collaboration with various stakeholders and design systems.

 

The following diagram illustrates the normal scenario of PLM-PIM integration. PLM starts the product creation process (more common in the manufacturing industry). At the end of the process, PLM pushes the product information to PIM. PIM further enriches the product information for the various channels and downstream systems (corresponding to functional units like billing, customer interface, logistics, etc.).

Typical PLM-PIM.jpg

Product MDM Design for PLM-PIM Communications

 

Two-way communication between PLM and PIM is not recommended and should be avoided as far as possible. But if it is absolutely necessary for PLM to receive enriched and cleansed product information from PIM, then the solution has to be designed so that the two-way communication does not create a recursive communication loop between PLM and PIM.

 

There is no single "correct" solution; there could be multiple ways to handle the two-way communication. The following diagram illustrates one way to handle this scenario between PLM and PIM.

 

PLM-PIM Solution.jpg

 

The following aspects are considered in the product information flow:

1)     Separate the identity of product information coming from PLM from that of other sources, and identify the source of product information.

2)     Identify the trust factor of PLM product information. For instance, not all attributes coming from PLM are to be trusted and moved to PIM. Only selective attributes that are core physical product characteristics with high confidence in their quality are to be copied to PIM.

3)     PLM subscribes to product publishing from PIM and takes in selective attributes.

4)     PLM does not publish product information back to PIM if the feed came from PIM for that specific update.
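The loop prevention in point 4 can be sketched as origin tagging: each update carries the system it originated from, and a receiver never applies its own echo. This is an illustration of the design principle only; the names are hypothetical and not part of SAP MDM or SAP PLM:

```java
// Sketch of recursive-loop prevention between PLM and PIM: each update
// carries its originating system, and a receiver drops updates that it
// originated itself. (Illustration only; names are hypothetical.)
public class FeedLoopGuard {

    public static final String PLM = "PLM";
    public static final String PIM = "PIM";

    // Returns true when 'receiver' should apply an update whose
    // originating system is 'origin' -- i.e. never its own echo.
    public static boolean shouldApply(String receiver, String origin) {
        return !receiver.equals(origin);
    }
}
```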

 

The following section describes how points '1' and '2' are taken care of in SAP MDM. SAP PLM is out of scope of this white paper and hence is not illustrated here.

 

Enabling PLM-PIM two way communications in SAP MDM

 

 

This section illustrates one way of configuring SAP MDM to enable PLM-PIM two-way communication.

 

The following diagram shows the various SAP MDM components in the configuration. Each component's configuration is subsequently described in detail.

 

SAP MDM Components1.jpg

 

 

Element: Product

- Product entity, uniquely identified by Product ID.

- Name and Description are standard identifying attributes.

- Width, Height and Breadth are defined by PLM.

- Base Cost and Base Price are enriched by PIM.

- PLM Product ID identifies whether the specific record corresponds to a PLM feed or to a primary Product record.

- Associated Product is applicable for PLM feed records only; it associates the primary Product record with the PLM product record.

Element: Import – PLM Feed Data

This is the automated import that comes from PLM. PLM sends the attributes that PIM is interested in (Width, Height and Breadth). It can also send redundant attributes like Base Cost; PIM ignores Base Cost, as it should be enriched by PIM and not by PLM. During import, PLM Product ID is used to identify the unique record in the "Products" catalog. The import starts as soon as PLM feed data is available in the "Ready" folder of the "PLM Feed Data" port.

Element: Workflow

The workflow is started on import of "PLM Feed Data", and on updates to the PLM feed record only. It triggers the syndication (port "PLM Data") to export only the necessary attributes and ignore redundant ones; in this case only Width, Height and Breadth are exported and Base Cost is ignored.

Element: Syndicate – Port PLM Data

This syndication picks only the necessary attributes out of the PLM feed and creates a separate exported feed for the "Products" main catalog. It also extracts the associated primary Product ID and inserts it into the export; this is necessary to identify the primary Product record and do the merging.

Element: Import – Port Product Import

This is an automated import. As soon as the syndicated file of port "PLM Data" is available in the "Ready" folder of port "Product Import", the import into the Products catalog starts. It matches the primary Product record using the Product ID attribute present in the imported data. Upon a successful match, it copies only the Width, Height and Breadth data from the imported feed.

 

SAP MDM Configuration

The SAP MDM Console defines the company "ABC Company" and the "Products" catalog along with the associated hierarchy.

 

C1.JPG

 

Create ports for:

1)    An inbound port for importing product master data coming from PLM. The imported data goes to the "landing" row in the "Products" catalog associated with the PLM feed source. All information coming from PLM (in the scope of product master data) is imported.

2)    An outbound port for exporting product information from the "landing" row with only selective attributes; redundant attributes are ignored.

3)    An inbound port for importing the selective attributes exported by outbound port '2'. This data is imported into the primary Product record in the "Products" catalog.

 

Port: Inbound - Product Import

 

As illustrated in the following diagram, the "PLM Feed Data" port is defined for '1', the "PLM Data" port for '2', and the "Product Import" port for '3'.

 

For all the ports above, the "Delimited Text" format is used as an example.

 

C2.JPG

 

The "Product Import" port is defined as inbound with the Delimited Text format, "," as separator and "Automatic" as processing type. The Automatic processing type enables auto-import into SAP MDM: as soon as a file in the specified format is available in the port's "Ready" folder, SAP MDM automatically imports it.

 

Port: Inbound - PLM Feed Data

 

C3.JPG

The "PLM Feed Data" inbound port is defined to import the selective attributes out of the PLM data feed. The format is again "Delimited Text" with "," as separator. The processing type is set to "Automatic" so that SAP MDM imports the file automatically.

 

Port: Outbound - PLM Data

 

C4.JPG

The "PLM Data" port is defined as an outbound port to export the selective fields received from PLM. Fields/attributes that are authored by PLM and have high-quality data are considered for copying to the main record in the PIM "Products" catalog.

 

Import Map: For Port PLM Feed Data

 

Corresponding to these ports, import and syndication mappings are defined.

 

IM2.JPG

 

The map is defined as shown above for the "PLM Feed Data" port. The map takes in the flat "Delimited Text" file that comes from PLM. (Note: the flat file format is used for illustration purposes; XML could be used as the standard mechanism.) Source data is mapped to a row in the "Products" catalog corresponding to the PLM data source.

 

IM3.JPG

Match identification is based on the "PLM Product ID" value in the incoming information and the corresponding "PLM Product ID" in the "Products" catalog record. The action is set to "Update (All Mapped Fields)" for matches on "PLM Product ID".

 

Import Map: For Port Product Import

 

IMP1.JPG

The map is defined as illustrated in the above diagram for the port "Product Import", to import the selective attributes that have come from PLM. The selected fields are Breadth, Height and Width. These are engineering attributes and, for the sake of the example, are perceived as high-quality information coming from PLM. Therefore these attribute values are copied to the corresponding primary Product record in the "Products" catalog.

 

IMP2.JPG

 

Product information matching is based on the "Product ID". The action is set to "Update (All Mapped Fields)", so the Breadth, Height and Width values are copied onto the matched PIM record based on "Product ID".

 

Syndication Map: For Port PLM Data

 

SD1.JPG

The syndication map is defined for port "PLM Data". This map picks the Width, Height and Breadth attributes from the PLM data row in the "Products" catalog. It also picks the "Product ID" from the "Associated Product" information maintained as part of the PLM product record. The PLM product record maintains a relationship with the primary Product record in the "Products" catalog.

 

SD2.JPG

 

To create the syndication mapping, destination items need to be defined as shown in the diagram above.

 

Workflow

 

 

One workflow is defined.

1) PLM Feed: This workflow automates the export of selected attributes as soon as PLM feed data is updated in the “Products” catalog.

 

W1.JPG

 

The workflow “PLM Feed” takes in product data provided by PLM and generates an automated export of selected attributes for later consumption. The trigger action includes “Record Import” as one of the valid criteria. Autolaunch is set to “Immediate” so that the workflow launches without manual intervention.

 

W11.JPG

 

For illustration purposes, the workflow is kept simple. It has two steps. The “branch” step checks whether the record just updated in the “Products” catalog comes from the PLM feed or whether the primary product record was updated. If it is not from the PLM feed, the workflow goes to the “stop” stage and ends. If it is from the PLM feed, the “syndication” step runs next and exports the selected attributes from the PLM feed data.

 

Summary

As far as possible, two-way communication between a PIM source and PIM should be avoided unless it is critical. When it is necessary, as in PLM-PIM communication, proper care should be taken to filter and check the information flowing between them in both directions. Without such checks, product data can become corrupted, and PLM and PIM can drift completely out of sync.

 

Related Content

SAP MDM Console Guide

SAP MDM Data Manager Guide

Key Capabilities of MDM

Automated Housekeeping and Monitoring Activities in MDM


Deletion of Logs, report and files from distribution folder

 

o   Automated Deletion of logs

 

§  Executed at regular intervals (weekly); data older than 90 days is deleted.

§  Data is retained for the last 90 days.

§  Log deletion can be carried out with a Redwood job (<SID>_MDM_ALL_HOUSEKEPNG_SCRIPT) or a cron job that runs on the MDM system. In the Redwood job, call a script that contains the command below:

find /usr/sap/<SID>/MDS<XX>/log -name '*' -mtime +90 -exec rm {} \;

§  Cross-check the log deletion in case the automated mechanism fails. Location of the logs to be deleted:

·         /usr/sap/<SID>/MDS<XX>/log

 

          • The cron job can be configured through crontab -e. The following entry deletes logs older than 90 days and runs weekly:

                    00 03 * * 0 find /usr/sap/<SID>/MDS<XX>/log -name '*' -mtime +90 -exec rm {} \;

 

 

 

o   Automated Deletion of Reports

 

§  Executed at regular intervals (weekly); data older than 90 days is deleted.

§  Data is retained for the last 90 days.

§  Report deletion is carried out with a Redwood job (<SID>_MDM_ALL_HOUSEKEPNG_SCRIPT) or a cron job that runs on the MDM system. In the Redwood job, call a script that contains the command below:

find /usr/sap/<SID>/MDS<XX>/mdm/reports -name '*' -mtime +90 -exec rm {} \;

§  Cross-check the report deletion in case the automated mechanism fails. Location of the reports to be deleted:

·         /usr/sap/<SID>/MDS<XX>/mdm/reports

 

 

          • The cron job can be configured through crontab -e. The following entry deletes reports older than 90 days and runs weekly:

                         00 03 * * 0 find /usr/sap/<SID>/MDS<XX>/mdm/reports -name '*' -mtime +90 -exec rm {} \;

 

 

o   Automated Deletion of files in distribution folder

 

§  Performed on a daily basis; data older than 30 days is deleted.

§  Data is retained for the last 30 days. The number of files in the distribution folder should stay below 50K.

§  Cross-check the file deletion in case the automated mechanism fails. Locations of the files to be deleted are given below.

§  File deletion is carried out with a Redwood job (<SID>_MDM_ALL_HOUSEKEPNG_SCRIPT) or a cron job that runs on the MDM system. In the Redwood job, call a script that contains the commands below (repeated for the Archive, Exception and Ready folders of every Inbound and Outbound port of every repository):

find /usr/sap/<SID>/MDS<XX>/mdm/distributions/<DBSID>_ORCL/<Repository Name>/Inbound/...... -name '*' -mtime +30 -exec rm {} \;

find /usr/sap/<SID>/MDS<XX>/mdm/distributions/<DBSID>_ORCL/<Repository Name>/Outbound/...... -name '*' -mtime +30 -exec rm {} \;

 

          • The cron job can be configured through crontab -e. The following entries delete distribution files older than 30 days and run daily:

 

02 09 * * * find /usr/sap/<SID>/MDS<XX>/mdm/distributions/<DBSID>_ORCL/<Repository Name>/Inbound/...... -name '*' -mtime +30 -exec rm {} \;

02 09 * * * find /usr/sap/<SID>/MDS<XX>/mdm/distributions/<DBSID>_ORCL/<Repository Name>/Outbound/...... -name '*' -mtime +30 -exec rm {} \;
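The three deletion jobs above can also be combined into one script called from the Redwood job or from cron. The following is a minimal sketch; the base path, the retention values, and the added `-type f` restriction (so directories themselves are never removed) are assumptions to adapt to your landscape:

```shell
#!/bin/sh
# Hypothetical consolidated housekeeping script (paths and retention
# periods taken from the values used in this document; adjust as needed).
# Call it from the Redwood job <SID>_MDM_ALL_HOUSEKEPNG_SCRIPT or cron.

MDM_BASE="${MDM_BASE:-/usr/sap/SID/MDS00}"   # set to your /usr/sap/<SID>/MDS<XX>

purge() {
    # purge <directory> <retention-days>: delete plain files older than
    # <retention-days>; -type f keeps the directory structure intact.
    dir="$1"; days="$2"
    [ -d "$dir" ] || return 0
    find "$dir" -type f -mtime +"$days" -exec rm -f {} \;
}

purge "$MDM_BASE/log" 90             # server logs: keep 90 days
purge "$MDM_BASE/mdm/reports" 90     # import/syndication reports: keep 90 days
purge "$MDM_BASE/mdm/distributions" 30   # distribution folders: keep 30 days
```

A single crontab entry can then replace the individual find lines, which keeps the retention rules in one reviewable place.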

 

 

o   CLIX Monitoring

§  Constant CLIX monitoring is enabled at OS level.

§  Command to be executed for CLIX monitoring:

 

nohup clix mdsMonitor <hostname> Admin:sapmdm -W -C -T 5 >> clix_mon.out 2>&1 &

 

§  The file “clix_mon.out” is stored in “/usr/sap/<SID>/user/<sid>adm”.

 

Note: This monitoring runs in the background. When a high volume of output is anticipated or two weeks have elapsed, whichever comes first, kill the CLIX monitoring process, rename the output file, and start the process again.

 

§  The files “clix_mon.out*” are to be retained for the last 90 days.
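The rotation described in the note above can be scripted. This is a hypothetical sketch; the monitor's process name (`clix`) and the `MON_DIR` variable are assumptions to verify on your host:

```shell
#!/bin/sh
# Hypothetical rotation sketch for the CLIX monitor output file:
# stop the monitor, rename clix_mon.out with a date suffix, restart it.
# MON_DIR defaults to $HOME; on the MDM host, point it at the
# directory where clix_mon.out is written.

MON_DIR="${MON_DIR:-$HOME}"
MON_FILE="clix_mon.out"

rotate_monitor_log() {
    # stop any running monitor; the exact process name is an assumption
    pkill -x clix 2>/dev/null || true
    # keep the old output under a dated name (retained for 90 days)
    if [ -f "$MON_DIR/$MON_FILE" ]; then
        mv "$MON_DIR/$MON_FILE" "$MON_DIR/$MON_FILE.$(date +%Y%m%d)"
    fi
}

# On the MDM host: rotate, then restart the monitor, e.g.
# rotate_monitor_log
# nohup clix mdsMonitor <hostname> Admin:<password> -W -C -T 5 \
#     >> "$MON_DIR/$MON_FILE" 2>&1 &
```

The restart line is left commented out because the hostname and credentials are site-specific.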

 

 

Tuple Hierarchy Maintenance In SAP MDM


Tuple Hierarchy Maintenance


We are going to replace a hierarchy table with a multi-valued tuple hierarchy. With this approach we no longer need to link the leaf node of the hierarchy in the main table, and two separate inbound loads from the source system to MDM are no longer required. We can save a significant amount of memory, and the search behaves like a hierarchy drill-down in the main table. Let us see the overall steps one by one.

 

Steps for creating Tuple Hierarchy

We are going to create a tuple hierarchy table with 4 levels; each level has a Code and a Description. So create 4 flat lookup tables, and note that both Code and Description must be set as display fields.

If Code and Description are not set as display fields, the compound field cannot be created in the inbound map.

  • Create 4 lookup Flat Tables (level 1, level 2, level 3 and level 4).

 

  • The data structure of all flat lookup tables is Code and Desc.
  • The display field is enabled for all flat lookup tables in MDM.

  • Maintain Code 2 and Desc 2 in the other flat lookup tables.
  • Create a multi-valued tuple table, which will later be linked to the main table.

  • The Data structure of the Tuple is as mentioned below.
  • Level 1

  • Level 2

            

  • Level 3

  • Level 4

 

 

 

  • The L1 field is mapped to the Level 1 flat lookup table, and L2 is linked to another tuple table.
  • Now create a main table and link the first tuple table into it.

  • Now start the repository with Update Indices and open the Import Manager to create the inbound map.

 

Inbound Map Creation:

  • Import the delimited text file in order to create the inbound map.
  • Choose the source and destination tables.
  • Now map the fields of the source file to the destination table as mentioned below.

  • Now we need to create a compound field and map it to the destination tuple field.

  • We have to create compound fields for all 4 levels; they will automatically be mapped to the required destination fields.
  • Once the compound fields are created, you can see all 4 fields in the source field pane.

  • Choose “Add” as the MDIS unmapped value handling for all 4 levels of the hierarchy.

 

  • Now we need to configure the matching field of the tuple, as per the option below.

  • The Replace option is generally used to keep the hierarchy in sync with source system updates.

  • Choose the matching field and save the map.
  • Create the port for automatic processing and start the automatic load from the source system.
  • Once the data is processed successfully, the hierarchy structure will be as shown below.

 

 

 

 

 

Syndication of Hierarchy to Target system:

 

  • The target system may require parent and node ID information; this can be achieved with the help of the middleware system.

 

 

 

 

 

Getting Started with SAP NetWeaver Master Data Management


Today, large and mid-scale companies operating on the base of diversified IT landscapes often suffer from master data that is inconsistently stored in multiple, disconnected systems or databases. Unmanaged master data is notoriously inaccurate, redundant, and full of discrepancies, all of which result in high maintenance costs, vulnerable business processes and poor business decisions.

SAP NetWeaver Master Data Management (MDM) enables companies to consolidate and harmonize their master data within heterogeneous IT landscapes. It consistently delivers vastly reduced data maintenance costs, ensures cross-system data consistency, accelerates the execution of business processes, and greatly improves decision-making.

 

 

Overview

SAP NetWeaver Master Data Management Solution Brief

This solution brief provides an overview of the master data management features and processes of SAP NetWeaver Master Data Management.

 


Getting Started with MDM Guides 

Release 7.1

This document gives a high-level overview of SAP NetWeaver Master Data Management (MDM), including its functional components and main features. A tutorial specifically dedicated to new MDM customers provides a guided tour through the product and allows the readers to get hands-on experience.

MDM Tutorial Sample Data

 

Release 7.1

This package contains the sample data required to go through the step-by-step tutorial for new MDM customers:

  • Step-by-Step Example of Harmonizing Master Data for Release 7.1

 

SAP NetWeaver MDM Overview  

This presentation provides an overview of SAP NetWeaver MDM including main capabilities, applicable scenarios, and customer examples. For deeper insight into the master data integration, master data operation, and master data quality capabilities, see the level 2 presentation.

 

SAP NetWeaver Master Data Management 7.1  

Get information on the current release, SAP NetWeaver MDM 7.1.

 

More Information

Misaligned Master Data is a Compromised Corporate Asset 

Read this SAPInsider article to get a comprehensive overview of SAP NetWeaver Master Data Management in a business context.

Selective Syndication


Applies to:


SAP MDM 7.1


Summary


Organizational boundaries are becoming thinner, bringing external partners into the purview of the enterprise MDM solution. It is inevitable to syndicate master data to external parties such as business partners, vendors, support centers, data providers and many others. The ideal place to perform the data transformation and syndication of master data specific to the needs of each external party is the integration layer. However, as a quick win, it is often necessary to build a syndication solution within MDM for syndication to high-priority partners. This demand on MDM is more complex than it looks on the face of it: the specific attributes required are likely to differ from one external partner to another.


This document describes how such syndication requirements can be supported within SAP MDM as a stop-gap arrangement to achieve a quick win.

 

Author(s):


Ajay Vaidya


Company:


Tata Consultancy Services


Created on:


09 September 2015


Author Bio


Ajay has been associated with Tata Consultancy Services (world-leading information technology consulting, services organization). He is handling the responsibility of Head of Item MDM Practice in TCS.



Introduction


Master Data Management solutions are widely accepted by organizations to maintain a single integrated view of master data. Organizations have been leveraging MDM solutions to enable data-driven business growth.

 

Business environments are changing. The digital wave has enabled businesses to collaborate and grow collectively. Organizational boundaries are becoming extremely thin in terms of information exchange and consumption.

 

Organizational boundaries are becoming thinner, bringing external partners into the purview of the enterprise MDM solution. It is inevitable to syndicate master data to external parties such as business partners, vendors, support centers, data providers and many others. The ideal place to perform the data transformation and syndication of master data specific to the needs of each external party is the integration layer. However, as a quick win, it is often necessary to build a syndication solution within MDM for syndication to high-priority partners. This demand on MDM is more complex than it looks on the face of it: the specific attributes required are likely to differ from one external partner to another.


This document describes how such syndication requirements can be supported within SAP MDM as a stop-gap arrangement to achieve a quick win.

 

Typical MDM syndication view


The Master Data Management solution acts like a heart, pumping quality master data to the enterprise, while also providing specific, filtered master data to specific external parties.

 

As shown in the diagram below, MDM maintains a single integrated view of master data. MDM sources master information from various systems, and data is also authored and enriched within the MDM solution. On the other side, master data is consumed by various enterprise-wide applications and is syndicated to external parties.


Syndication Landscape jpeg.jpg


Let us assume, for illustration purposes, that Business Partner A’s needs are critical and must be handled immediately on a priority basis, whereas the other business partners’ needs can wait until the integration-layer solution is deployed and available. In an ideal situation, master data formatting is done as part of the integration layer. However, in certain quick-win scenarios it becomes absolutely critical that the syndication needs of Business Partner A are taken care of within the MDM solution itself as a stop-gap arrangement. The following section illustrates the data model for the scenario considered in this article.


Electrical Systems Master Data



Let us consider, for illustration purposes, a manufacturing organization with an electrical systems product line. Bulbs and lighting are one segment of electrical systems.


Data Model.jpg

 

Certain attributes are common across all products (base attributes), whereas category-specific attributes apply only to products belonging to those specific categories.

 

Typically, when product master data is syndicated to downstream business partners, all attributes are considered and published.

 

However, external business partners very often demand only specific attributes. In such cases it is necessary to choose the key attributes and consider only those for syndication to the external partner. The scenario becomes harder still when different key base attributes must be considered for products of different categories. There is no straightforward way to achieve this filtering of base attributes based on the categories under consideration.


Consider a scenario where only the following base attributes are to be syndicated to the external business partner for products of specific categories.

 

 

 

The base attribute columns are Lifetime, Lamp Efficiencies, Color Code, Design Temperature, Lamp Wattage, Dimmable, Energy Efficiency Level, Resistor, Length and Color Designation. Per category, the attributes marked for syndication are:

Category          : Syndicated base attributes
Halogen Lamp      : Color Code, Design Temperature, Resistor
Infrared Lamp     : (none)
Incandescent Lamp : Color Code, Design Temperature
Fluorescent Lamp  : Color Code
Solar Lamp        : (none)
Medical Lamp      : Color Code, Color Designation

 

 

As illustrated in the table above, for Halogen Lamp products the attributes “Color Code”, “Design Temperature” and “Resistor” need to be syndicated to the external business partner, whereas for Medical Lamp products the “Design Temperature” and “Resistor” values are not supposed to be syndicated.

 

This table illustrates a simple case of selective filtering of syndication attributes based on product category. In reality, such needs can get considerably more complex. The ideal place to manage them is the integration layer; however, for critical and urgent requirements one may have to generate the syndication export directly from the MDM system for specific key business partners as a stop-gap arrangement.
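To make the matrix concrete, the category-to-attribute rules can be expressed as a small lookup. This is a hypothetical sketch outside MDM; inside MDM the same logic is carried by the branch assignments described in the following sections:

```shell
#!/bin/sh
# Hypothetical encoding of the per-category attribute filter from the
# table above: decide whether a given attribute of a given product
# category may be syndicated to the external business partner.

allowed() {
    # allowed <category> <attribute>: exit 0 if syndicated, 1 otherwise
    case "$1|$2" in
        "Halogen Lamp|Color Code"|\
        "Halogen Lamp|Design Temperature"|\
        "Halogen Lamp|Resistor") return 0 ;;
        "Incandescent Lamp|Color Code"|\
        "Incandescent Lamp|Design Temperature") return 0 ;;
        "Fluorescent Lamp|Color Code") return 0 ;;
        "Medical Lamp|Color Code"|\
        "Medical Lamp|Color Designation") return 0 ;;
        *) return 1 ;;   # Infrared Lamp, Solar Lamp: nothing syndicated
    esac
}
```

A filter like this could drive an external post-processing step on an exported file, but the in-MDM solution below avoids the need for one.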


SAP MDM Solution


There are three aspects to consider when defining a stop-gap solution in SAP MDM.


  1. Copy attributes reserved for syndication purposes. These attributes are replicas of the product base attributes, but their values are set as a reflection of the original base attributes depending on the category under consideration. This can be done automatically using “Auto Workflows” and “Branch Assignments”.
  2. Define branch assignment rules that copy the original attribute values to the copy attributes only for specific category branches. For other category branches, no value is copied.
  3. Define an automated workflow that is triggered when product information is created or updated, and execute the various defined assignments in it. This workflow ensures that whenever product attributes change, the corresponding copy attributes are updated according to the category under consideration.

 

SAP MDM Solution Behavior


Let us see how these three aspects translate into the necessary behavior.


DM1.jpg


For Halogen-type products, the “Color Code” value should be copied from the original “Color Code” attribute to another “Color Code” attribute (SP_Color_Code) that is reserved specifically for product data export.


The product manager changes the Color Code value from 380 to 400.

 

DM2.jpg

 

As soon as the “Color Code” value is changed, the automated workflow runs behind the scenes and updates the SP_Color_Code value. It does not update the Resistor Ohm value, as that value is not supposed to be syndicated to the external business partner.

 

As shown in the following diagram, a workflow entry is listed as completed. This is the automated workflow instance that ran and set the various filtered attributes, including the Color Code value applicable to Halogen Lamp products.


DM4.jpg

Let us see another example, where the “Color Designation” of Medical Lamp products is copied automatically to SP_Color_Designation using the workflow and branch assignment.


DM6.jpg

As soon as the “Color Designation” value is changed to Ultra Soft White, it is automatically copied to the syndication attribute “SP_Color_Designation”.


DM7.jpg

As part of the syndication process, all the replicated attributes (named SP_*) can be mapped to the export items. Depending on the product category, the values either appear in the export or remain empty. The syndication process itself is not discussed in this article, as it is the standard syndication process.


SAP MDM Configuration


The Products table holds the base product attributes. The UNSPSC taxonomy defines the various category levels and category-specific attributes. Attributes starting with “SP” are reserved as copy attributes for syndication to the external business partner.


Product 1.jpg

The taxonomy is defined with various category-specific attributes and their association with the categories. “Linked” indicates that a specific attribute is associated with a specific category.


Taxonomy1.jpg

Assignments are defined for the various copy attributes. As illustrated, assignments are defined for Color Designation, Color Code, Design Temperature and Resistor, along with branch-specific assignments.


AS1.jpg

For “Color Designation”, a branch assignment can be added by right-clicking the base assignment and selecting a specific branch of a specific taxonomy.


AS1-2.jpg

As illustrated, for the base Color Designation assignment the assignment field is set to “SP_Color_Designation”. A branch assignment is then defined with a formula that copies the original “Color Designation” value.


AS2.jpg

Similarly, base and branch assignments are defined for “Color Code”. The copy attribute corresponding to Color Code is set with the value of the original Color Code attribute for products in the categories Halogen Lamp, Incandescent Lamp, Fluorescent Lamp and Medical Lamp.


AS3.jpg

A workflow is defined to execute all these assignments whenever a product is created or updated.

 

workflow1.JPG


workflow2.JPG



Summary


Selective filtering of syndication attributes is often required as an immediate stop-gap arrangement and a quick quality win. Business cannot always wait until the integration layer is established and available for use. Very often such key, critical syndication requirements need to be handled as part of the MDM configuration. SAP MDM supports this need through automated workflows and branch assignments.


Related Content


SAP MDM Console Guide

SAP MDM Data Manager Guide

Key Capabilities of MDM

GDS Custom attributes and usage in PI


SAP GDS comes with a standard SAP-predefined data model for the MDM product repository. Based on business needs, we can always extend the data model by adding custom fields/attributes in the GDS repository. Creation and maintenance of GDS custom fields are described in SAP Note 1375813, “How to extend GDS repository with custom fields”.


In this document I explain how GDS custom fields are handled in SAP PI GDS inbound and outbound mapping.


Types of Custom Elements in PI Relevant to GDS:


Below are the elements available in the MT_TradeItems data type as standard GDS PI content. If any GDS custom attributes are created, the elements below should be used in the PI mapping to send the values to GDS.


The elements are accessible under /ns0:MT_TradeItems/Payload/Item/TargetMarketData/CustomFields.


  • CustomGenericElement
    • GDS attributes created directly under the table TMData > Z_<customName> in the Console.

 

  • CustomLanguageElement
    • These custom attributes are used to handle multi-language fields in GDS qualifier tables. Make sure to map the Key and Value fields in PI; the key-value mapping should be maintained in GDS.

 

  • CustomLookupElement
    • If you want to maintain lookup value details, this element is helpful. Make sure to maintain the reference path in the PI mapping; the reference path should relate to the GDS qualifier lookup table. The key-value mapping should be maintained in GDS.

 

  • CustomQualifierRecordElement
    • This element is used to maintain tuple fields in GDS.

 

Each custom element has the following attributes; they should be mapped or hardcoded in the PI mapping with the exact (technical) name from GDS.

 

  • tablecode
  • Fieldcode
  • Reference path - the table name should be identified/mapped in PI with ^ and the field names separated with ~

Untitled_1.png

If the key-value mapping is not maintained in GDS, the inbound message ends with a warning in the GDS process logs; you can reprocess it after maintaining the mapping in GDS.

If the reference path and table details are not mapped correctly in PI, the message fails in GDS.

 

For GDS outbound, the trade items export mapping should be customized as explained in Note 1375813. Always refer to the 1SYNC guide when mapping the flex attributes (attr, attrMany, attrGroupMany, etc.).

 

Please feel free to share your thoughts and suggestions in the comments!!




MDM 7.1 troubleshooting guide


This document provides a list of common errors that we face in MDM 7.1, along with troubleshooting instructions.

Target audience: SAP MDM administrators.

 

Issues & Solutions:

 

SRM MDM Known Error DB

Issue

Solution

MDM Console CRC error

Check the MDM Console version and the MDM server version; both should be at the same patch level.
For example, MDM Console 7.1.10 cannot mount an MDM server at 7.1.12.

SRM MDM Search UI and Config UI not accessible

  1. Check if you are able to connect to the MDM repository through the Console.
  2. Check if the repositories are in started status (nothing broken).
  3. Check if the slaves are in sync with the master.
  4. Connection test: http://<SRMservername>:<port>/SRM-MDM/SRM_MDM?
  5. Check if the user configured in the call structure is locked.
  6. Restart the MDM repositories.
  7. Restart the SRM Java application servers.

 

 

Master repository Broken

check the DB connections;
stop the slaves;
stop the master;
normalize the master;
create the slaves;
and mount the slaves to the app servers

 

 

MDM repository locked

Refer to SAP Note 1663002.
Table: MDM_SRM_CATCONFIG;
check the repository’s Is_Locked field; if true, set it to false.

 

 

MDM slave broken

remove it from the call structure (MDM team) if the other repositories are OK;
record the port numbers;
drop/delete the slave;
recreate the slave from the master;
mount it to the app server;
start the slave repository;
add it back to the call structure (SM34);

 

 

MDM files not being consumed

check if there are any files in the Ready/Exception folders;
move them to a temp folder; copy one file back to Ready for testing;
mount the master app server in the Console;
in the Auxiliary Servers section, mount the import server;
re-key the password and save; restart the MDIS server.

 

 

MDM slave sync fails

check the sync logs;
check if the master/slave port is blocked;
re-trigger the sync manually;
check if the master is broken for any reason;
check if the slave content was edited directly for any reason;
drop the slave
and recreate it from the master (repeat the broken-slave steps)

 

 


MDM Data Import problem

Most of the time it is some special case in the import file that causes the issue.
Look in /usr/sap/<SID>/MDIS<nn>/logs/ for the error message RC 0xffaa0200 (invalid logon credentials).
Check the MDIS import logs for more details;
check the MDM DB connections;
try importing the file manually through the MDM Import Manager (work with the MDM consultants)

 

 

MDM Data (supplier / catalog) missing

check if you are connecting to the right repositories;
check that the slaves are not broken;
get input from the MDM consultant regarding the selection used in the UI;
launch MDM Data Manager;
use the selection to filter the data;
if you can see the data in Data Manager, try syncing the slaves with the master

 

 

MDM UI search - fails

get the user ID configured in the call structure;
check if the user is locked for any reason;
check if the user has the necessary authorizations

 

 

MDM console hangs while mounting servers

check DB connectivity: telnet <dbhost> <dbport>
check port 59950 on the server: telnet localhost 59950
check the MDS logs for network connectivity issues
check the mds.ini parameters against SAP recommendations
check the server memory configuration (ulimit/limit)

Try updating the saphostagent and the MDM executables.

 

 

MDM master repository
structural issue (missing field ai2_**)

check if any ALTER command was triggered on the DB table
check for SAP Notes (if any)
try restoring by unarchiving the latest archive file (data loss possible)

 

 

SRM UI Re-Configuration

Delete the relevant repositories from the MDM server.
Delete the old configuration entries related to that repository from the MDM_SRM_CATCONFIG table in the DB.
Unarchive the new MDM repository.

 

 

MDM Repository locked (SRM _MDM UI)

The table MDM_SRM_CATCONFIG in the J2EE server has an Is_Locked parameter for all the repositories.
Solution: DBeaver > connect to the respective DB > sapj1 > MDM_SRM_CATCONFIG > select the repository > IS_LOCKED > set to false.

 

 

MDM Repository read-only mode (in MDM Data Manager)

Check the status of the repository in A2I_XCAT_DBS > A2I_CATALOGS.
Check if there are any old hanging sessions in MDM Console > MDM server > repository > Admin > Connections. If yes, try restarting the MDM server.

 

 

Repository Outdated

MDM Console > mount server > connect to repository > right-click > Update Repository.
Right-click > Start Repository > Update Indices.
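Several of the checks in the table above boil down to TCP reachability tests (the database port, the MDS port 59950). They can be combined into a hypothetical first-pass script; the helper below assumes bash and the coreutils `timeout` command are available on the host:

```shell
#!/bin/sh
# Hypothetical first-pass connectivity check combining the individual
# telnet tests from the troubleshooting table above.

check_port() {
    # check_port <host> <port>: exit 0 if the TCP port accepts a
    # connection within ~3s; uses bash's /dev/tcp pseudo-device
    timeout 3 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Examples (placeholders; adjust to your landscape):
# check_port <dbhost> <dbport> || echo "database unreachable"
# check_port localhost 59950   || echo "MDS port 59950 not listening"
```

This only confirms that something is listening; the MDS logs and mds.ini checks from the table are still needed to diagnose a hang.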

Time Variant MDM


Applies to:

SAP MDM 7.1


Summary:

Time is one of the aspects that affect a product, either in terms of its physical characteristics or the perceived effectiveness of its qualities. This has direct implications on the expected revenue of retailers and manufacturers, and it also affects the operational cost invested in logistics and inventory management.

 

Therefore, it is critical to model this business scenario when defining the enterprise Master Data Management solution.


This paper describes how time-variant product scenarios can be modeled using SAP MDM.

 

Author(s):

Ajay Vaidya


Company:

Tata Consultancy Services


Created on:

09 October 2015


Author Bio:

Ajay has been associated with Tata Consultancy Services (world-leading information technology consulting, services organization). He is handling the responsibility of Head of Item MDM Practice in TCS.




Introduction

 

It is well known that each product has a shelf life; after the prescribed shelf life, the product ceases to be fit for purpose. Retail organizations always aspire to manage product launches and the supply chain effectively, so that the maximum remaining shelf life of the product is spent in the store rather than in inventory, the warehouse, or transit.

 

There is one more key dimension that affects shelf life. Some products change their physical characteristics as time passes. Others, such as season-specific or event-specific products, drastically lose their useful value as soon as the season or event is over.

 

Retailers should manage such products effectively to maximize return value.

 

Time-variant products that change their physical characteristics are tricky to manage. An organization’s information management systems should be able to model such products in a way that reflects their time-variant nature. Many organizations know the importance of these aspects but fail to manage them appropriately in their information management.


This paper describes how effective management of such time-variant products starts with Master Data Management, using SAP MDM.

 

Time Variant Products


Time-variant products are those whose key aspects change over time, either their physical characteristics or their useful life span.


  1. Products whose value drops drastically after a specific time frame even though their physical characteristics remain intact and fit for use. These are typically seasonal products. For example, rain gear has high demand and value during the monsoon season, and high-end educational products have high demand during the back-to-school season. After the specific time frame, the return revenue of these products drops significantly.
  2. Products that change their physical characteristics yet continue to be fit for purpose. These are typically fresh food products such as fruit. For instance, a banana takes on different physical characteristics over time and can be sold for use in each of them: the green banana product becomes a yellow banana product, with a complete change in how it is sold. Logistics, storage, price and purpose are completely different for green and yellow bananas.

 

Category 2 products are tricky to handle. For instance, green bananas become yellow bananas if kept in inventory for a certain period. Thus the inventory of green bananas drops to zero, while the stock of yellow bananas suddenly increases without any manual intervention. Effective supply chain and inventory management are key to ensuring an adequate supply of items in stores. However, if such time-variant products are not tracked properly with respect to their time span, the result can be out-of-stock situations for some products and excess stock for others.

 

Retail operations should take cognizance of this critical aspect and ensure that at any given point in time the product identity is accurately mapped. Retail operating systems should be able to identify this product identity transition. However, it is practically difficult for every operational system to track the identity change.

 

Master Data Management Solution

 

A Product Information Management (PIM) solution comes to the rescue here. PIM acts as the single reliable, integrated, and trusted source of product information. PIM should take responsibility for defining how the product identity changes over a period of time.

 

There are two aspects of the solution:

  1. Tracking the time change for time variant products: it is the responsibility of retail operations and the operational systems to track the inventory and shelf time span of a specific product lot.
  2. Publishing the product identity at any specific time: this is the responsibility of the PIM solution. The PIM solution defines the multiple product identities and how those identities are related to each other. For instance, PIM would define both the Green Banana and the Yellow Banana products. It also defines a time based rule that reveals the identity of a specific product instance/lot, provided that the time change is tracked by the operational systems.

 

For non time variant products, an operational system such as an inventory management system would typically refer to the PIM system to discover detailed physical characteristics and other product details. For instance, the PIM system would provide information on how a specific product should be stored in inventory, at what temperature it should be kept, what packaging hierarchy it has, and so on.


The same principle is followed for time variant products. The only difference is that the operational system, e.g. the inventory management system, keeps track of when a specific lot of the product was stocked. When referring to the PIM system, the inventory management system also provides the elapsed time since the product was stocked, along with the current product identity details (current product type). In return, the PIM system provides not only the product characteristics, but also details of any change in product identity (type). For instance, per the business rule defined in the PIM system, if the time span for green bananas to become yellow bananas has already elapsed, the PIM system returns the new Yellow Banana identity for the product lot originally identified as Green Banana.
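As a minimal illustration of this lookup, the rule data can be reduced to a single function: given the current product identity and the elapsed days reported by the operational system, return the identity that applies now. The day thresholds (6 and 8) mirror the dummy "Banana 6" and "Banana 8" rules used later in this paper; the function and product names are purely illustrative and are not part of any SAP MDM API.

```shell
#!/bin/sh
# Illustrative sketch of a PIM-style identity lookup for time variant
# products. Rule thresholds and product names are dummy data from this
# paper, not real SAP MDM objects.

# resolve_identity CURRENT_PRODUCT ELAPSED_DAYS
# Echoes the product identity that applies after ELAPSED_DAYS in stock.
resolve_identity() {
  product="$1"
  elapsed="$2"
  case "$product" in
    "Brand A Green Banana")
      # Transformation mapped to rule "Banana 6": natural progression after 6 days
      [ "$elapsed" -ge 6 ] && echo "Brand A Yellow Banana" && return
      ;;
    "Brand C Green Banana")
      # Transformation mapped to rule "Banana 8": natural progression after 8 days
      [ "$elapsed" -ge 8 ] && echo "Brand C Yellow Banana" && return
      ;;
  esac
  echo "$product"   # no rule triggered: identity unchanged
}

resolve_identity "Brand A Green Banana" 4   # -> Brand A Green Banana
resolve_identity "Brand A Green Banana" 7   # -> Brand A Yellow Banana
```

The operational system only supplies the two inputs; the thresholds and target identities stay in one place (the PIM side), which is exactly the split of responsibility described above.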


Operationa-PIM.jpg

In this way the responsibility for accurately identifying the product identity is split between the PIM system and the operational systems. The operational systems track the elapsed time span, whereas the PIM system holds the business rule and reveals the accurate product identity at any given point in time.

 

SAP MDM Solution Components

 

For the sake of simplicity, and to focus on the subject of this paper, other complexities are omitted.

 

The primary product catalog (Food Product) holds various products. For simplicity, only a food retailer is considered here; in practice such a catalog would hold multiple types of products, including food, non-food, grocery, toys, etc. Food Product holds the various physical characteristics as well as additional details of the products.

 

The Food Product catalog is organized by the primary product hierarchy (Food Hierarchy), which arranges food products into various hierarchical categories.

 

The Transformation catalog holds the transformations to be applied to time variant products. Each transformation is associated with a start product (the product at the beginning of the transformation), a target product (the product resulting from the transformation), and a transformation rule.

 

Transformation Rule defines the rules associated with the various transformations. These rules can be time variant or time independent. Natural progression is one type of time variant transformation rule.


Transformation Rule Type defines the taxonomy and the category-specific attributes that further characterize a specific transformation rule. Natural progression rules are defined by the number of elapsed days after which the rule triggers a specific action. This action could simply change the product type of a specific banana lot from green bananas to yellow bananas.


Logical Data Model.jpg


SAP MDM Configuration


For the sake of simplicity, only specific components are illustrated, with dummy data elements. Food Product is a catalog that maintains food product details for the ABC organization. It holds various types of products and their associated attributes; only a limited set of attributes is defined as an example.


FoodProductDM.jpg

The Food Hierarchy is defined to illustrate the various types of products. The third level of the hierarchy describes product groups such as “Apple and Pear”, “C Vitamin Fruits”, “Organic Fruits”, and “Bananas”. Bananas are further split into “Green Bananas” and “Yellow Bananas”.


Food Hierarchy Data.jpg


The Transformation catalog holds the various transformations to be applied to food products in the ABC organization. It holds a reference to the “Start Product”, which is the product at the beginning of the transformation, and to the “Target Product”, which is the product resulting from the transformation. It also holds a reference to the transformation rule, which is maintained in another catalog. Other vital details are maintained as part of the transformation, including but not limited to the transformation name and transformation identifier.


TransformationDM.jpg

Transformation rules are the rules that define the actual transformation, and they are referenced by transformations. A transformation rule holds key basic information such as the rule name and rule ID. Transformation rules are classified in a hierarchy that defines the attributes associated with specific rules.


TransformationRuleDM.jpg

The Transformation Rule Type taxonomy defines the various types of rules and the attribute definitions associated with each type.

 

Transformation rules are classified by the various ways in which food products can be transformed. Broadly, there are time based transformations, which are bound to happen after a specific time span; these can be further split into natural progression transformations and manually introduced transformations. The change of bananas from green to yellow is a natural progression transformation, whereas a seasonal variance in a product's price is a manually introduced time based transformation. Time independent transformations are one-time transformations driven by a changing business environment, for example changing the sales geography of a specific product to address an uptick in demand in certain locations.

 

As illustrated, the “Days Life” attribute is associated with the Natural Progression type. The banana rules illustrated below are mapped to this taxonomy and are hence characterized by the number of days after which the rule triggers a specific action.


TransformationRuleTypeData.jpg

For example, the “Banana 6” rule is of type “Natural Progression” and is defined to trigger its action after 6 days of elapsed time.

 

TransformationRuleData.jpg

 

To illustrate the example, let us assume that Brand A, Brand B, and Brand C bananas are each defined in both their green and yellow types. The ABC organization sells both green and yellow bananas for Brand A and Brand C.


BrandAGreenBananaData.jpg



Two transformations are defined, one for Brand A and one for Brand C, to transform the respective green bananas into the respective yellow bananas. Each transformation is mapped to a specific transformation rule. For example, the Brand A transformation is mapped to the “Banana 6” rule and specifies that the product transforms from Brand A green banana to Brand A yellow banana.


BrandATransformationData.jpg



Similarly, Brand C is associated with a transformation that is mapped to the “Banana 8” rule.


BrandCTransformationData.jpg


Defining the service is out of scope for this paper. However, it would be straightforward to define a service that references the Food Products catalog and the transformation rules to identify the current product type for any time variant product inventory.



Summary


Product information is split between Product Master Data Management and various operational systems. This creates a challenge in effectively handling time variant products, which change their physical characteristics over time and result in a new salable product. It is the responsibility of both the product master systems and the operational systems in the landscape to handle time variant products. With proper information synchronization between these two types of systems, the time variant product scenario can be handled effectively.


Related Content

SAP MDM Console Guide

SAP MDM Data Manager Guide

Key Capabilities of MDM



Master script to manage MDM instances and repositories in a Linux/Oracle environment


We had to install MDM at my company, but when I looked at the documentation about the commands to start and stop it, I saw:

 

MDM_script.png

 

 

I was not very happy with this.

 

So I developed my own script:

 

mdmadm> ksh mdm_master help

usage: mdm_master action [option1] [option2]

action can be start, stop, status, repair, backup

 

#### start ####

if "mdm_master start" is run without options, it will ask for confirmation before starting/mounting each repository.

if "mdm_master start ALL", it will start everything without asking for confirmation.

if "mdm_master start ALL MOUNT", it will start the servers but only mount the repositories (without starting them).

 

#### stop ####

if "mdm_master stop" is run without options, it will ask for confirmation before stopping/unmounting each repository.

if "mdm_master stop ALL", it will stop everything without asking for confirmation.

if "mdm_master stop ALL MOUNT", it will only stop the repositories and leave them mounted.

 

#### repair ####

if "mdm_master repair" is run without options, it will ask for confirmation before checking/repairing each repository.

if "mdm_master repair ALL", it will check/repair everything without asking for confirmation.

 

#### backup ####

  if "mdm_master backup", it will back up each repository without confirmation.
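Since the backup action runs without confirmation, it lends itself to scheduling. A possible crontab entry for the mdmadm user is sketched below; the script path and log location are assumptions, so adjust them to your installation:

```shell
# Nightly archive of all repositories at 02:00 (illustrative paths).
0 2 * * * ksh /home/mdmadm/mdm_master backup >> /home/mdmadm/mdm_backup.log 2>&1
```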


You need to set up a config file to define some variables:

ORACLE_USER=

ORACLE_PASSWORD=

ADMIN_PASS_REPO=

DBHOST=

DBPORT=

ORACLE_HOME=

ORACLE_SID=

OS_USER=

OS_PASSWORD=

SYS_NUMBER_MDS=

SYS_NUMBER_MDIS=

SID_MDM=

LD_LIBRARY_PATH=/usr/sap/${SID_MDM}/MDS${SYS_NUMBER_MDS}/exe:$LD_LIBRARY_PATH:/usr/sap/${SID_MDM}/MDIS${SYS_NUMBER_MDIS}/exe

LOGICAL_HOSTNAME=

As you can see, even if you have several repositories, the script needs only one password. If a repository's Admin password is not the one defined in the config file, the script will reset it to the config file password using the clix repEmergencyAdminUserSetPassword command.


Please find the config file and script here:

mathieugravil-coding/shell/SAP/MDM at master · mathieugravil/mathieugravil-coding · GitHub


Some examples:

mdmadm> ksh mdm_master start

ZJ2 is UP

SAP service /usr/sap/MDM/MDS10/exe/sapstartsrv is UP

SAP service /usr/sap/MDM/MDIS11/exe/sapstartsrv is UP

mds.sapMDM_MDS10 is UP

mdis.sapMDM_MDIS11 is UP

Used ports: 2015 2016 2017 2012 2013 2014 2009 2010 2011 2006 2007 2008 2003 2004 2005 2000 2001 2002

Repo to be mounted:

Repo already mounted but to be started: SRM_MDMTEST_REP SRM_MDM_DEV01_MAIN_MAS SRM_MDM_DEV_SRM7_MAS SRM_MDM_FIORI_DEV SRM_MDM_FIORI_QUAL TEST_TEMP

 

 

SRM_MDMTEST_REP SRM_MDM_DEV01_MAIN_MAS SRM_MDM_DEV_SRM7_MAS SRM_MDM_FIORI_DEV SRM_MDM_FIORI_QUAL TEST_TEMP

o Starting of repository :

Are you sure you want to start SRM_MDMTEST_REP  (yes/no)?yes

o I will try to start SRM_MDMTEST_REP :

  OK: repStart is done on SRM_MDMTEST_REP

Are you sure you want to start SRM_MDM_DEV01_MAIN_MAS  (yes/no)?no

Nothing done

Are you sure you want to start SRM_MDM_DEV_SRM7_MAS  (yes/no)?no

Nothing done

Are you sure you want to start SRM_MDM_FIORI_DEV  (yes/no)?no

Nothing done

Are you sure you want to start SRM_MDM_FIORI_QUAL  (yes/no)?no

Nothing done

Are you sure you want to start TEST_TEMP  (yes/no)?no

Nothing done

 

 

mdmadm> ksh mdm_master status

o STATUS OF DB :

ZJ2 is UP

 

 

o STATUS OF SAPSERVICES :

SAP service /usr/sap/MDM/MDS10/exe/sapstartsrv is UP

SAP service /usr/sap/MDM/MDIS11/exe/sapstartsrv is UP

 

 

o STATUS OF MDM SERVER  :

mds.sapMDM_MDS10 is UP

 

 

o STATUS OF MDM IMPORT SERVER  :

mdis.sapMDM_MDIS11 is UP

 

 

o STATUS OF REPOSITORIES  :

SRM_MDMTEST_REP # has status #STOPPED# and it use port #2000#.

SRM_MDM_DEV01_MAIN_MAS # has status #STOPPED# and it use port #2003#.

SRM_MDM_DEV_SRM7_MAS # has status #STOPPED# and it use port #2006#.

SRM_MDM_FIORI_DEV # has status #STOPPED# and it use port #2009#.

SRM_MDM_FIORI_QUAL # has status #STOPPED# and it use port #2012#.

TEST_TEMP # has status #STOPPED# and it use port #2015#.

 

 

 

mdmadm>  ksh mdm_master backup

BACKUP

o WARNING : Doing  backup on STARTED repo, could be not a good idea, but i will try to do it  for SRM_MDMTEST_REP :

o I will try to backup SRM_MDMTEST_REP in /usr/sap/MDM/MDS10/mdm/archives/SRM_MDMTEST_REP_160209.a2a :

  OK: cpyArchive is done on SRM_MDMTEST_REP

o I will try to backup SRM_MDM_DEV01_MAIN_MAS in /usr/sap/MDM/MDS10/mdm/archives/SRM_MDM_DEV01_MAIN_MAS_160209.a2a :

  OK: cpyArchive is done on SRM_MDM_DEV01_MAIN_MAS

o I will try to backup SRM_MDM_DEV_SRM7_MAS in /usr/sap/MDM/MDS10/mdm/archives/SRM_MDM_DEV_SRM7_MAS_160209.a2a :

  OK: cpyArchive is done on SRM_MDM_DEV_SRM7_MAS

o I will try to backup SRM_MDM_FIORI_DEV in /usr/sap/MDM/MDS10/mdm/archives/SRM_MDM_FIORI_DEV_160209.a2a :

  OK: cpyArchive is done on SRM_MDM_FIORI_DEV

o I will try to backup SRM_MDM_FIORI_QUAL in /usr/sap/MDM/MDS10/mdm/archives/SRM_MDM_FIORI_QUAL_160209.a2a :

  OK: cpyArchive is done on SRM_MDM_FIORI_QUAL

o I will try to backup TEST_TEMP in /usr/sap/MDM/MDS10/mdm/archives/TEST_TEMP_160209.a2a :

  OK: cpyArchive is done on TEST_TEMP

 

 




Enjoy !!!

Don't hesitate to make remarks !!


