Channel: SCN : Document List - SAP NetWeaver Master Data Management

Featured Content in SAP NetWeaver Master Data Management


Join Customer Survey on SAP NetWeaver Master Data Management

Please help us continuously enhance SAP NetWeaver Master Data Management by taking part in the current customer survey. 03 December 2013

 

Combining SAP NetWeaver MDM with SAP Lumira

Read this blog to find out how you can visualize master data information for analytical purposes using SAP Lumira. 20 November 2013

 

Benefits from SAP NetWeaver MDM 7.1 SP10!

Read Menachem Melamed's blog to find out about the benefits of SAP NetWeaver MDM Support Package 10. 28 March 2013

Benefits from SAP NetWeaver MDM 7.1 SP09!

Read Menachem Melamed's blog to find out about the benefits of SAP NetWeaver MDM Support Package 9. 03 Sep 2012

 

Benefits from SAP NetWeaver MDM 7.1 SP08!

Read Steffen Ulmer's blog to find out about the benefits of SAP NetWeaver MDM Support Package 8. 07 Mar 2012


SAP MDM Functional Consulting



SAP MDM Functional Consulting

 

An Approach Paper

 

By Shahid Noolvi

 

Table of Contents

 


1. Abstract
2. SAP MDM Functional Consulting
3. The Problem Statement and Solutions Considered
4. Summary/Conclusion
5. Reference
6. About the author
7. Glossary


1. Abstract

      The objective of this white paper is to help client companies, IT service providers, project managers, and SAP MDM implementation teams understand the roles and responsibilities of an SAP MDM functional consultant, and how efficient functional consulting at implementation time pays off in the long term, irrespective of domain and technology. I often hear people ask master data experts, "Are you MDM technical or functional?" The question always strikes me, because few companies even consider hiring an SAP MDM functional consultant at implementation time.

An SAP MDM technical consultant is the one who deals with tasks such as data extraction and operations in the MDM Console, Import Manager (IM), Data Manager (DM), and Syndication Manager (SM).

The work of an SAP MDM functional consultant actually begins well before the implementation of the masters is decided. The choice of functional consultant depends primarily on the master data object: it should be someone who knows the master data from both a domain and a system perspective, irrespective of technology.

For example: for the material master and vendor master objects it can be an MM consultant; for the customer master object it can be an SD consultant.

 

Most IT managers are concerned with maintaining their existing systems, while their process colleagues focus on business value; both look at the bottom line and the time needed to deliver on promises. Many MDM projects only go as far as consolidating and harmonizing data. Managing data centrally requires concrete business rules, which take considerable time, effort, and expense. With another system to maintain and support, and no quick wins to show off, a project is soon viewed as an expensive data cleansing exercise and put on hold. Most of this happens due to a lack of proper consulting and approach toward the implementation. A technical consultant may provide solutions for consolidating and harmonizing data, since these are an integral part of the MDM application's capabilities. However, when it comes to the wider goal of managing master data centrally, a strong understanding of the business processes, and of why, where, and in which application each part of the master data object is desired and required, must come before any solution is proposed.

For example: in the telecom industry, the billing applications may not require the customer's banking information, whereas the same information is essential in the commission disbursement system and in the ERP system.

The ERP application used for day-to-day transactions may need more than just the master data information created at the time of master data creation.

For example: maintaining the J1ID table for tax-related data is not mandatory at the time of material creation; however, if it is not maintained, it may hamper the purchase order (PO) creation process, because the proper tax calculation will not be applied in the PO.

 

 

 

 

2. SAP MDM Functional Consulting

 

The SAP products that do best in today's market are either absolutely essential to business functions (like financials) or offer an immediate, tangible benefit (like ESS in HR, or BI that results in better reporting). Perhaps the challenge for MDM is that companies don't generally think of it in terms of rollouts that could gain ROI momentum for something bigger. They tend to see MDM as an overhaul of their data, and that is where buy-in becomes more of a challenge. This makes it essential for MDM consultants to understand the business processes and master data of the client company before the implementation.

 

An MDM project implementation must include people who fully and correctly understand the business landscape and who study the business masters involved in the project. The functional consultant is responsible for this and must specifically look into this area to understand how the functional requirements map to the MDM repository.
Blueprinting, designing, and modeling of the master data are handled by the functional consultant together with the person who designs the repository. The SAP MDM functional consultant is responsible for interacting with senior business and IT client contacts to provide functional expertise. This role drives the master data strategy, data governance and standardization, master data process design, high-level MDM architecture, and the implementation roadmap defined by the business case.

 

Testing is another phase in which the MDM functional consultant plays a major role. Every aspect and field included in the MDM application needs to be tested for integration, migration, and performance. While the technical consultant can run these tests, the functional consultant adds value by checking whether the master data created actually provides proper data across systems and whether transactions run more smoothly with the new approach to master data creation.

 

 

 

 

 

3. The Problem Statement and Solutions Considered

 

Master Data Management is more than an application; what we implement is a concept, or a process change. Whenever a new application is implemented, the team includes functional expertise in the area as well as a technical team with knowledge of the application, to ensure a successful project. Because MDM is an application on the SAP NetWeaver platform, it has been tagged as a technical implementation since its introduction. However, many managers now understand that an MDM implementation is more process-oriented than merely the tool used to implement it. SAP, through its marketing, has been able to portray the advantages of using its application in the field of MDM. Since MDM is a process-oriented implementation, understanding the master data and its role in the business processes becomes very important. The functional consultant plays an important role in business blueprinting, process study, process alignment, and the implementation of all of these through the application.

 

Having understood the problem of MDM being categorised as a tool for technical implementation only, let us also understand why the need has been overlooked by the majority of customers, and the impact of this.

 

We discussed that SAP MDM is part of the NetWeaver component. So what is SAP NetWeaver? Here is Wikipedia's definition:

“SAP NetWeaver is SAP's integrated technology computing platform and is the technical foundation for many SAP applications since the SAP Business Suite. SAP NetWeaver is marketed as a service-oriented application and integration platform. SAP NetWeaver provides the development and runtime environment for SAP applications and can be used for custom development and integration with other applications and systems. SAP NetWeaver is built using primarily the ABAP programming language, but also uses C (programming language), C++, and Java EE. It also employs open standards and industry de facto standards and can be extended with, and interoperate with, technologies such as Microsoft .NET, Java EE, and IBM WebSphere.”

 

SAP NetWeaver includes SAP MDM, SAP EP, SAP BI, SAP MI, and SAP IDM. This may have reinforced the belief that MDM is a technical application like the other NetWeaver modules. What is ignored here is the fact that MDM cannot be implemented without a functional understanding of the processes, the business impact of the master data, and the application itself. Misunderstanding MDM as a concept can have serious consequences. Master data management is more than just consolidation, harmonisation, and synchronisation of master data; correctly judging whether the data actually has to undergo all these processes is a challenge. Apart from this, cost can also be considered one of the reasons.

For example: the technical team can generate a report that lets users see duplicate records in the existing system, but the true challenge is deciding which fields to include in the strategy for identifying duplicates. The relevant fields for different master data objects vary by industry, and identifying them requires in-depth knowledge of both the industry and the master data objects in question.
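To make the point concrete, here is a minimal, illustrative sketch of a duplicate-identification strategy keyed on a chosen set of fields. The field choices (name and city) and the normalization rules are invented for this example and are not part of any SAP-delivered tool; picking the right fields for a given industry is precisely the functional consultant's job.

```java
import java.util.*;

public class DuplicateCheck {

    // Build a match key from the fields chosen for the duplicate strategy.
    // Which fields to use (here: name + city) is exactly the functional
    // decision discussed above; these are illustrative choices only.
    static String matchKey(String name, String city) {
        return name.trim().toLowerCase().replaceAll("[^a-z0-9]", "")
                + "|" + city.trim().toLowerCase();
    }

    // Group vendor records {id, name, city} by match key and return only
    // the groups with more than one record, i.e. the duplicate candidates.
    static Map<String, List<String[]>> findDuplicates(List<String[]> records) {
        Map<String, List<String[]>> groups = new LinkedHashMap<>();
        for (String[] r : records) {
            groups.computeIfAbsent(matchKey(r[1], r[2]), k -> new ArrayList<>()).add(r);
        }
        groups.values().removeIf(g -> g.size() < 2);
        return groups;
    }

    public static void main(String[] args) {
        List<String[]> vendors = Arrays.asList(
                new String[]{"V001", "Acme Steel Ltd.", "Mumbai"},
                new String[]{"V002", "ACME STEEL LTD", "Mumbai"},
                new String[]{"V003", "Acme Steel Ltd.", "Pune"}); // same name, other city
        for (List<String[]> group : findDuplicates(vendors).values()) {
            System.out.println("Possible duplicates:");
            for (String[] r : group)
                System.out.println("  " + r[0] + " " + r[1] + " " + r[2]);
        }
    }
}
```

Swapping city for, say, a tax number changes which records get flagged, which is why the field selection is a business decision rather than a purely technical one.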

 

 

Master data management can help an enterprise reduce costs and can result in clean, efficient data that better supports business processes and key requirements, such as meeting compliance standards. But getting master data management software up and running without exactly understanding its purpose can end in improper handling of master data. In the example above, we saw how the identification of duplicates can be handled with the help of a functional consultant; without one, we may identify duplicates using the wrong set of fields, which can have a huge impact on the business. This can be as serious as data loss, since most MDM applications aim to erase duplicate entries or merge them into one.

 

Now that we have understood the problem, let us look at the solution: certain best practices to be applied during an SAP MDM project implementation. To begin with, consider the basic capabilities the functional consultant must possess. The functional consultant's role is to drive the master data strategy, data governance and standardization, master data process design, high-level MDM architecture, and the implementation roadmap defined by the business case. The consultant is heavily involved in a mix of client advisory work, quality and health checks, management of third-party vendors, and management of senior stakeholders. It is a client-facing role that requires highly intelligent and articulate individuals. The consultant also has to train the end users and make them understand why the organization has undergone this process change and what its benefits are. Documentation, another very important part of a project, must also be completed by the functional consultant. Once the actual project implementation begins, the SAP MDM functional consultant must be capable of data modeling, preparing the data flow diagrams, and defining the high-level solution architecture. Other responsibilities include conducting implementation analysis in the respective area, delivering high-level system requirements and functional designs, working with the test manager to define appropriate levels of testing, working with technical consultants to agree on technical designs, and working with security to ensure the appropriate roles are built.

To address the cost issue, a single functional expert from the SCM area who is familiar with the SCM master data objects and processes is the best possible option for this role. Other attributes can be evaluated based on experience and the desired position.

The client organization must be specifically aware of what it intends to implement, and based on that need IT managers can put forward consultants with the appropriate knowledge set. The functional consultant acts largely as an advisor, helping the client understand what will actually benefit the organization and whether a master data object needs an external application to handle it.

 

 

 

 

4. Summary/Conclusion

 

There is a clear purpose in including a functional consultant in every implementation or support project, and MDM is no exception. Because MDM is seen as a more technical module, however, the need has been minimized, or, one might say, ignored. The discussion in this paper should make the decision easier and justify the need for an SAP MDM functional consultant to analyze the impact of MDM on the master data object creation process, its necessity, and its implementation. Training the end users and testing the application add to this purpose. Proper data modeling and good solution architecture during the implementation will help organizations reduce maintenance costs and use the product to its full potential.

 

While we said that for the material master and vendor master objects the MDM functional consultant can be an MM consultant, and for the customer master object an SD consultant, an SCM process consultant with expertise in the procure-to-pay process can take the lead for all the master data objects. Moreover, in today's competitive environment, a techno-functional consultant (a functional consultant trained on the technical tool's capabilities and configuration) can achieve even better productivity.

 

5. Reference

 

http://www.careerjet.sg

http://www.cwjobs.co.uk

http://scn.sap.com

http://service.sap.com

http://www.jonerp.com

 

6. About the author

           Shahid Noolvi is an MBA graduate trained in SAP MM and SAP MDM, with four years of experience in SAP project implementations in the areas of SAP MM and SAP MDM. Shahid has work experience in the SAP CoE team of an Indian telecom operator, has supported the SCM team across all SAP transactions, handled change requests, and trained end users on newly implemented processes.

Shahid Noolvi graduated in Company Secretaryship from Karnatak University, Dharwad, and holds a postgraduate Master of Business Administration degree. During his corporate tenure he has gained professional expertise in the SAP functional modules for Materials Management and in SAP MDM, and has been very efficient in handling business requirements.

Shahid has professional expertise in the areas of SAP project management, the SAP MM module, SAP NetWeaver MDM, SAP TAO, and SAP RWD documentation tools, and he is also certified for Six Sigma Yellow Belt and Green Belt project implementations.

Looking forward, Shahid intends to move into newer SAP modules such as the SAP GTS platform and to work extensively in customer consulting and solution architecture positions.

During his tenure with his earlier company, Shahid received the Award of Appreciation for solution design and development during the GSM rollout for the telecom major. He has also received the Tata Innovative Team Leadership Initiative Award for innovative cost-saving ideas in the South Region.

 

 

7. Glossary

IT – Information Technology

SAP MDM – SAP Master Data Management, a tool provided by SAP to manage an organization's master data; it covers both SAP and non-SAP master data.

IM – Import Manager, a client shipped with SAP MDM, used to import data from external sources into MDM.

DM – Data Manager, the client in which the actual data is managed.

SM – Syndication Manager, a client shipped with SAP MDM, used to export data from MDM to external applications.

ROI – Return on Investment

ESS – Employee Self-Services, an application within the SAP Enterprise Portal where employees can manage their leave and attendance and perform other employee-related activities.

 

CoE – Center of Excellence, the team formed to implement, manage, or analyse projects.

 

SCM – Supply Chain Management, which handles the procure-to-pay process in organizations.

 

J1ID – a table in SAP in which excise duty information is maintained vis-à-vis a material.

 

PO – Purchase Order, a document created in SAP to procure goods and services.

 

SAP EP – SAP Enterprise Portal

 

SAP MI – SAP Mobile Infrastructure

 

SAP IDM – SAP Identity Management

External Enrichment - Master Data Enrichment Approach


 

Introduction

SAP MDM (Master Data Management) is a tool used for maintaining and managing an organization's master data across geographies. In other words, we can describe MDM as an enrichment tool: it enriches business data in order to provide correct and reliable data for better business decisions, to reduce fault resolution time, to speed up data processing, and much more.

 

But there are still questions, needed for effective decision making, that SAP Master Data Management cannot answer directly. The tool does, however, define a baseline through an integration model that helps you get answers to these business-specific questions before deciding whether a particular piece of data needs to be maintained in the central hub.

 

This article revolves around:

 

  • Concept of External Enrichment
  • How SAP MDM Integrates with Third Party Enrichment Tools
  • Key benefits of this approach.

 

Need for External Enrichment

MDM always plays a vital role when it comes to reducing duplicate data or mastering the data for a business with respect to its master objects, such as products, vendors, customers, and materials. But there are scenarios in which the business needs further enrichment, not of core master and transactional data but of data about the organizations the company is dealing with, for example:

 

 

  • Will my customer pay me on time, or will he pay at all?
  • What is the financial condition of my business partner? Are his financial statements running into the red?
  • Will my supplier deliver on time?
  • Is my business partner reliable enough to do business with?
  • Is my business partner involved in lawsuits, or does he have any background that would hamper my business?

 

 

Getting answers to all these questions is what you call "risk management". These questions cannot be answered directly by MDM, but SAP Master Data Management can be modeled in such a way that before any master data, such as a vendor, material, or customer, is stored, complete information about it is obtained, helping the organization make efficient decisions. These kinds of enrichment are not possible using MDM alone; however, to manage risks effectively it is imperative to have all this data consolidated for a particular master data object. With a diverse system landscape and duplicate customers and vendors existing in various parts of the organization, assessing risk becomes a very big challenge. Inaccurate risk management leads not only to bad financial decisions but may also hamper your relations with your vendors and customers.

 

 

External Enrichment

 

 

External enrichment services are provided either remotely by a service provider or by locally installed software. They typically offer very specialized and sophisticated enrichment functionality for certain kinds of master data.

 

There are various service providers on the market with which an organization can sign a deal, such as DnB, Acxiom, Intermat, Zycus, and JP Morgan. You send the basic information for a master data record, and the service provider sends back complete, accurate, and clean information. Some providers, like DnB, also maintain scores and ratios on various risk aspects that they calculate themselves. Of course, none of this comes for free!

 

Consider the example of enriching customer address data. Third-party service providers like Dun & Bradstreet (DnB) provide industry-standard, accurate information on global entities. Given a small set of information, such as the DUNS number, name, city, and country, DnB replies with a wealth of information, such as the WorldBase package containing the organization's hierarchy and specific details like email, phone, website, and other address details.

 

Using SAP MDM and the Enrichment Controller provided by SAP, we can send the request as a web service to DnB and then enrich our repository records with the information returned. This increases data quality enormously: errors such as typos, misspellings, and incomplete information are avoided because the data provided by DnB is standardized and accurate.

 

Similarly, other web services can be invoked from SAP MDM, for example for VAT numbers or latitude and longitude. Such an enrichment architecture can ensure high-quality master data.

 

 


 

 

Enrichment Controller

 

Enriching your business partner master data with the necessary information is the key to your risk management initiative. You can use a single third-party data provider or opt for multiple providers to give you specific information on business partners. Requesting and gathering the information is one part, but most important is using it effectively to make better decisions. Through its enrichment architecture, SAP MDM provides a service-oriented approach to easily connect and communicate with third-party providers. And because all the data is consolidated with SAP MDM acting as a central hub, better risk management is facilitated.

 

All the steps of consolidating, cleansing, and de-duplicating come in handy while enriching, because you can now ask for exactly the master data information you need and your request details are more accurate. This makes it simpler for the third-party providers to send you the required details.

 

 

                                (Figure: SAP MDM Enrichment Controller communicating with third-party service providers)

As shown above, SAP MDM comes with an Enrichment Controller which helps maintain communication with 3rd party service providers. The steps to be followed are:

 

 

  • SAP MDM syndicates XML file to Enrichment Controller.
  • Enrichment Controller will forward the file to 3rd party service provider.
  • 3rd party service provider will analyze the data and send the information requested.
  • SAP MDM will import the XML file using Import Manager and update data of the specific Master data object.
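The four steps above can be sketched end to end in a small simulation. This is illustrative only: the XML layout, the tag names, and the provider's behaviour are invented for the example and do not reflect any real provider's schema or the actual Enrichment Controller API.

```java
public class EnrichmentRoundTrip {

    // Step 1: MDM syndicates a request containing only the basic identifying data.
    static String buildRequest(String duns, String name) {
        return "<enrichmentRequest><duns>" + duns + "</duns>"
             + "<name>" + name + "</name></enrichmentRequest>";
    }

    // Steps 2-3: the controller forwards the request; the provider analyses it
    // and returns the enriched details (simulated here with fixed values).
    static String simulateProviderResponse(String request) {
        String duns = request.replaceAll(".*<duns>(.*?)</duns>.*", "$1");
        return "<enrichmentResponse><duns>" + duns + "</duns>"
             + "<website>www.example.com</website>"
             + "<riskScore>2</riskScore></enrichmentResponse>";
    }

    // Step 4: the response is imported and the record updated with the new fields.
    static String extract(String response, String tag) {
        return response.replaceAll(".*<" + tag + ">(.*?)</" + tag + ">.*", "$1");
    }

    public static void main(String[] args) {
        String request = buildRequest("123456789", "Acme Steel Ltd");
        String response = simulateProviderResponse(request);
        System.out.println("Enriched website:    " + extract(response, "website"));
        System.out.println("Enriched risk score: " + extract(response, "riskScore"));
    }
}
```

In a real scenario the Enrichment Controller handles the transport and the response is imported through an Import Manager map; here the provider is simulated in-process so the round trip can be followed in one file.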

Conclusion

Using enrichment at the organizational level helps the master data steward obtain correct and accurate data without discrepancies.

 

Some of the key business benefits of using External Enrichment are:

 

  • A single face of the business partner, irrespective of whether it is a customer or a vendor.
  • Reduction of fault resolution time.
  • Reduction of overhead costs due to incorrect data.
  • Efficient management thanks to enriched master data, and hence better decision making.
  • Compliance adherence as a result of avoiding unwanted business partner relations.
  • Accurate reporting by BI systems due to a consolidated single source of business partner data.
  • Increased business revenue and better relationships with your customers and vendors, by analyzing them as business partners across geographies and systems and thereby being able to offer them better financial rates and terms.

Getting started with MDM Java API


Applies to:

SAP Netweaver Master Data Management, as of MDM 7.1. For more information, visit the Master Data Management homepage.

 

Summary

The main purpose of this article is to provide code snippets to get started with the MDM Java API, a very flexible way of using SAP MDM features. The snippets can be run directly from your IDE or from the command prompt.

 

Author:         Goutham Kanithi

Company:    Mahindra Satyam

Created on:  10 June 2012

 

Author Bio  


 

 

Goutham Kanithi has been associated with Mahindra Satyam for 18 months as part of the MDM practice. He is skilled in SAP MDM, Java, and J2EE, and completed his Bachelor's degree in Electronics and Communication Engineering.

 

 

 

 

 

 

  

Table of Contents

 

1          Business Scenario
2          Prerequisites
3          Code Snippets
     3.1        To print all the running repositories in an MDM server
     3.2        To print all the tables in a repository
     3.3        To print all the fields in a table
     3.4        Retrieve display field data of a table
     3.5        To retrieve selected field data from a table
     3.6        To store whole table (integer, text, lookup) data from a table in a multidimensional array
4          Related Content

 

1       Business Scenario

 

The MDM Java API is an application programming interface for the MDM software. It allows MDM customers to write customized applications that interact with the MDM server, and it is one of a powerful suite of MDM tools for next-generation master data management.

This article is a starting point for working with the MDM Java API, with complete sample code that can be executed directly.

 

2       Prerequisites

 

·         The MDM Java API should be downloaded from the SAP Service Marketplace, as shown below.

(Screenshot: MDM Java API download on the SAP Service Marketplace)

·         The MDM Java API libraries (JAR files) have to be referenced by the application at design time for compilation and must be available at runtime for execution.

·         The MDM Java API build version and the MDM Server build version should match. The build version of the MDM Java API can be checked as shown below.

(Screenshot: checking the MDM Java API build version)

 

 

3       Code Snippets

 

3.1       To print all the running repositories in an MDM server

 

The code below retrieves the running repositories on a given MDM server.

 

 

package test;

import com.sap.mdm.commands.CommandException;
import com.sap.mdm.commands.CreateServerSessionCommand;
import com.sap.mdm.commands.GetRunningRepositoryListCommand;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.net.SimpleConnection;
import com.sap.mdm.net.SimpleConnectionFactory;
import com.sap.mdm.server.RepositoryIdentifier;
import com.sap.mdm.session.SessionException;

public class Main {

    public static void main(String[] args) throws SessionException,
            ConnectionException, CommandException {

        String mdsName = "100.10.100.100"; // IP address or name of the Master Data Server

        // Open a connection to the Master Data Server
        SimpleConnection conAccessor = SimpleConnectionFactory.getInstance(mdsName);

        // Create a server session
        CreateServerSessionCommand servSessCmd = new CreateServerSessionCommand(conAccessor);
        try {
            servSessCmd.execute();
        } catch (CommandException e1) {
            e1.printStackTrace();
        }

        // Retrieve and print the list of running repositories
        GetRunningRepositoryListCommand repositoryList =
                new GetRunningRepositoryListCommand(conAccessor);
        repositoryList.execute();
        RepositoryIdentifier[] repositories = repositoryList.getRepositories();

        for (int w = 0; w < repositories.length; w++)
            System.out.println(repositories[w].toString());

        // Close the connection to the MDS
        conAccessor.close();
    }
}

      

3.2       To print all the tables in a repository

The code below retrieves the tables from a given repository.

 

 

package test;

import com.sap.mdm.commands.CommandException;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.schema.RepositorySchema;
import com.sap.mdm.schema.commands.GetRepositorySchemaCommand;
import com.sap.mdm.session.RepositorySessionContext;
import com.sap.mdm.session.SessionException;
import com.sap.mdm.session.SessionManager;
import com.sap.mdm.session.SessionTypes;
import com.sap.mdm.session.UserSessionContext;

public class Main {

    public static void main(String[] args) throws SessionException,
            ConnectionException, CommandException {

        String mdsName = "100.10.100.100"; // IP address or name of the Master Data Server
        String repositoryName = "Products"; // name of the repository
        String regionName = "English [US]";
        String userName = "Admin";          // enter a valid username
        String password = "";               // enter the password for the above username

        UserSessionContext context = new UserSessionContext(mdsName,
                repositoryName, regionName, userName);

        // Get an instance of the session manager and create a user session
        SessionManager sessionManager = SessionManager.getInstance();
        String ses = sessionManager.createSession(context,
                SessionTypes.USER_SESSION_TYPE, password);

        // Fetch the repository schema
        GetRepositorySchemaCommand grsc = new GetRepositorySchemaCommand(
                (RepositorySessionContext) context);
        grsc.setSession(ses);
        grsc.execute();
        RepositorySchema schema = grsc.getRepositorySchema();

        // Print the name and code of every table in the repository
        String[] tcodes = schema.getTableCodes();
        for (int i = 1; i < tcodes.length; i++) {
            System.out.println("Table name is: "
                    + schema.getTable(tcodes[i]).getName().get()
                    + " and table code is: " + tcodes[i]);
        }

        // Destroy the session and close the connection to the MDS
        sessionManager.destroySession(context, SessionTypes.USER_SESSION_TYPE);
    }
}

 

 

3.3       To print all the fields in a table

 

package test;

import com.sap.mdm.commands.CommandException;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.schema.FieldProperties;
import com.sap.mdm.schema.RepositorySchema;
import com.sap.mdm.schema.commands.GetRepositorySchemaCommand;
import com.sap.mdm.session.RepositorySessionContext;
import com.sap.mdm.session.SessionException;
import com.sap.mdm.session.SessionManager;
import com.sap.mdm.session.SessionTypes;
import com.sap.mdm.session.UserSessionContext;

public class Main {

    static public void main(String[] args) throws SessionException,
            ConnectionException, CommandException {

        String mdsName = "100.10.100.100"; // IP address or name of the Master Data Server
        String repositoryName = "Products"; // name of the repository
        String regionName = "English [US]";
        String userName = "Admin"; // enter a valid username
        String password = ""; // enter the password for the above username

        UserSessionContext context = new UserSessionContext(mdsName,
                repositoryName, regionName, userName);
        // Get an instance of the session manager
        SessionManager sessionManager = SessionManager.getInstance();
        // Create a user session
        String ses = sessionManager.createSession(context,
                SessionTypes.USER_SESSION_TYPE, password);

        GetRepositorySchemaCommand grsc = new GetRepositorySchemaCommand(
                (RepositorySessionContext) context);
        grsc.setSession(ses);
        grsc.execute();
        RepositorySchema schema = grsc.getRepositorySchema();

        FieldProperties[] fields = schema.getFields("MDM_Products"); // enter the table code
        for (int i = 0; i < fields.length; i++) {
            System.out.println("Field name : " + fields[i].getName().get()
                    + " and Field code : " + fields[i].getCode());
        }

        // destroy the session and close the connection to the MDS
        sessionManager.destroySession(context, SessionTypes.USER_SESSION_TYPE);
    }
}

 

 

3.4       Retrieve display field data from main table

The code below retrieves the display field of each record in the main table.

 

package test;

import com.sap.mdm.data.Record;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.data.ResultDefinition;
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.search.Search;
import com.sap.mdm.session.SessionException;
import com.sap.mdm.session.SessionManager;
import com.sap.mdm.session.SessionTypes;
import com.sap.mdm.session.UserSessionContext;

public class Main {

    static public void main(String[] args) throws SessionException,
            ConnectionException {

        String mdsName = "100.10.100.100"; // IP address or name of the Master Data Server
        String repositoryName = "Products"; // name of the repository
        String regionName = "English [US]";
        String userName = "Admin"; // enter a valid username
        String password = ""; // enter the password for the above username

        UserSessionContext context = new UserSessionContext(mdsName,
                repositoryName, regionName, userName);
        // Get an instance of the session manager
        SessionManager sessionManager = SessionManager.getInstance();
        // Create a user session
        sessionManager.createSession(context, SessionTypes.USER_SESSION_TYPE,
                password);

        TableId mainTableId = new TableId(1);
        ResultDefinition definition = new ResultDefinition(mainTableId);
        Search search = new Search(mainTableId);
        RetrieveLimitedRecordsCommand command = new RetrieveLimitedRecordsCommand(
                context);
        command.setResultDefinition(definition);
        command.setSearch(search);
        try {
            command.execute();
        } catch (Exception e) {
            e.printStackTrace();
            return;
        }
        RecordResultSet recordResultSet = command.getRecords();
        Record[] records = recordResultSet.getRecords();
        for (int i = 0; i < records.length; i++) {
            System.out.println("record number " + i + " is "
                    + records[i].getDisplayValue().toString());
        }

        // destroy the session and close the connection to the MDS
        sessionManager.destroySession(context, SessionTypes.USER_SESSION_TYPE);
    }
}

 

3.5       To retrieve selected field data from Table

 

package test;

import com.sap.mdm.commands.CommandException;
import com.sap.mdm.data.Record;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;
import com.sap.mdm.extension.data.ResultDefinitionEx;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.schema.RepositorySchema;
import com.sap.mdm.schema.commands.GetRepositorySchemaCommand;
import com.sap.mdm.session.RepositorySessionContext;
import com.sap.mdm.session.SessionException;
import com.sap.mdm.session.SessionManager;
import com.sap.mdm.session.SessionTypes;
import com.sap.mdm.session.UserSessionContext;

public class Main {

    static public void main(String[] args) throws SessionException,
            ConnectionException, CommandException {

        String mdsName = "100.10.100.100"; // IP address or name of the Master Data Server
        String repositoryName = "Products"; // name of the repository
        String regionName = "English [US]";
        String userName = "Admin"; // enter a valid username
        String password = ""; // enter the password for the above username

        UserSessionContext context = new UserSessionContext(mdsName,
                repositoryName, regionName, userName);
        // Get an instance of the session manager
        SessionManager sessionManager = SessionManager.getInstance();
        // Create a user session
        String ses = sessionManager.createSession(context,
                SessionTypes.USER_SESSION_TYPE, password);

        GetRepositorySchemaCommand grsc = new GetRepositorySchemaCommand(
                (RepositorySessionContext) context);
        grsc.setSession(ses);
        grsc.execute();
        RepositorySchema schema = grsc.getRepositorySchema();
        String tname = "MDM_Products";
        String[] selectFields = { "Id", "Material_Number", "Description" };
        TableId mainTableId = schema.getTableId(tname);
        ResultDefinitionEx resDef = new ResultDefinitionEx("Products", context);
        FieldId[] fid = new FieldId[selectFields.length];
        for (int i = 0; i < selectFields.length; i++) {
            fid[i] = schema.getFieldId(tname, selectFields[i]);
            resDef.addSelectField(selectFields[i]);
        }

        resDef.setLoadAttributes(true);

        Search search = new Search(mainTableId);
        RetrieveLimitedRecordsCommand command = new RetrieveLimitedRecordsCommand(
                context);
        command.setResultDefinition(resDef);
        command.setSearch(search);
        command.setSession(ses);

        try {
            command.execute();
        } catch (Exception e) {
            e.printStackTrace();
        }
        RecordResultSet recordResultSet = command.getRecords();
        Record[] records = recordResultSet.getRecords();

        for (int a = 0; a < fid.length; a++) {
            System.out.println("data in the field " + selectFields[a]);
            for (int b = 0; b < records.length; b++) {
                System.out.println("record no. " + b + "- "
                        + records[b].getFieldValue(fid[a]).toString());
            }
        }

        // destroy the session and close the connection to the MDS
        sessionManager.destroySession(context, SessionTypes.USER_SESSION_TYPE);
    }
}

 

3.6       To store whole-table data (integer, text, and lookup fields) in a multidimensional array

 

 

package test;

import com.sap.mdm.commands.CommandException;
import com.sap.mdm.data.Record;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;
import com.sap.mdm.extension.data.ResultDefinitionEx;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.schema.FieldProperties;
import com.sap.mdm.schema.RepositorySchema;
import com.sap.mdm.schema.commands.GetRepositorySchemaCommand;
import com.sap.mdm.search.Search;
import com.sap.mdm.session.RepositorySessionContext;
import com.sap.mdm.session.SessionException;
import com.sap.mdm.session.SessionManager;
import com.sap.mdm.session.SessionTypes;
import com.sap.mdm.session.UserSessionContext;

public class Main {

    static public void main(String[] args) throws SessionException,
            ConnectionException, CommandException {

        String mdsName = "100.10.100.100"; // IP address or name of the Master Data Server
        String repositoryName = "Products"; // name of the repository
        String regionName = "English [US]";
        String userName = "Admin"; // enter a valid username
        String password = ""; // enter the password for the above username

        UserSessionContext context = new UserSessionContext(mdsName,
                repositoryName, regionName, userName);
        // Get an instance of the session manager
        SessionManager sessionManager = SessionManager.getInstance();
        // Create a user session
        String ses = sessionManager.createSession(context,
                SessionTypes.USER_SESSION_TYPE, password);

        GetRepositorySchemaCommand grsc = new GetRepositorySchemaCommand(
                (RepositorySessionContext) context);
        grsc.setSession(ses);
        grsc.execute();
        RepositorySchema schema = grsc.getRepositorySchema();
        String tname = "MDM_Products";
        int j = 0;
        FieldProperties[] allProps = schema.getFields(tname);
        // keep the filtered field properties aligned with selectFields and fid
        FieldProperties[] fprop = new FieldProperties[allProps.length];
        String[] selectFields = new String[allProps.length];
        for (int i = 0; i < allProps.length; i++) {
            if (allProps[i].getType() <= 13) {
                selectFields[j] = allProps[i].getCode();
                fprop[j] = allProps[i];
                j++;
            }
        }
        TableId mainTableId = schema.getTableId(tname);
        ResultDefinitionEx resDef = new ResultDefinitionEx("Products", context);
        FieldId[] fid = new FieldId[j];
        for (int i = 0; i < j; i++) {
            fid[i] = schema.getFieldId(tname, selectFields[i]);
            resDef.addSelectField(selectFields[i]);
        }
        resDef.setLoadAttributes(true);

        Search search = new Search(mainTableId);
        RetrieveLimitedRecordsCommand command = new RetrieveLimitedRecordsCommand(
                context);
        command.setResultDefinition(resDef);
        command.setSearch(search);
        command.setSession(ses);

        try {
            command.execute();
        } catch (Exception e) {
            e.printStackTrace();
        }
        RecordResultSet recordResultSet = command.getRecords();
        Record[] records = recordResultSet.getRecords();
        String[][] data = new String[records.length][fid.length];
        for (int i = 0; i < fid.length; i++) {
            System.out.println("the field is : " + selectFields[i]);
            for (int g = 0; g < records.length; g++) {
                try {
                    // field types 7 to 10 are lookups; read their display value
                    if (fprop[i].getType() >= 7 && fprop[i].getType() <= 10) {
                        try {
                            data[g][i] = records[g].getLookupDisplayValue(
                                    fprop[i].getId()).toString();
                        } catch (NullPointerException e) {
                            data[g][i] = "";
                        }
                    } else {
                        data[g][i] = records[g].getFieldValue(fprop[i].getId())
                                .toString();
                    }
                    if (data[g][i].equals("[Null]") || data[g][i].length() > 15)
                        data[g][i] = "";
                } catch (IllegalArgumentException e) {
                    data[g][i] = "";
                }
                System.out.println(data[g][i]);
            }
        }

        // destroy the session and close the connection to the MDS
        sessionManager.destroySession(context, SessionTypes.USER_SESSION_TYPE);
    }
}
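As a side note, the cleanup rules in the example above (mapping the MDM "[Null]" marker, missing values, and over-length strings to an empty string) can be factored into a small helper. The sketch below is plain Java of my own; the method name sanitize and the 15-character cutoff mirror the example but are not part of the MDM API:

```java
public class SanitizeDemo {

    // Mirrors the cleanup rules used in section 3.6: treat the MDM
    // "[Null]" marker, a missing value, and over-length strings as empty.
    static String sanitize(String raw) {
        if (raw == null || raw.equals("[Null]") || raw.length() > 15) {
            return "";
        }
        return raw;
    }

    public static void main(String[] args) {
        System.out.println("'" + sanitize("[Null]") + "'");      // ''
        System.out.println("'" + sanitize(null) + "'");          // ''
        System.out.println("'" + sanitize("Material_42") + "'"); // 'Material_42'
    }
}
```

Keeping this logic in one place makes the record-printing loop shorter and ensures all field types are cleaned consistently.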

 

 

To be continued..

 

 

4       Related Content

 

http://www.sap.com/platform/netweaver/components/mdm/index.epx

http://help.sap.com/saphelp_mdm71/helpdata/en/13/041975d8ce4d4287d5205816ea955a/content.htm

http://help.sap.com/javadocs/MDM71/

SRM-MDM Catalog Implementation – End to End Procedure


 

 

Applies to:

Standard SAP MDM 7.1 Console, SAP SRM 7.0, SRM–MDM catalog 3.0 SP11, SRM-MDM catalog content 3.0, MDM SRM UI App. For more information, visit the Master Data Management homepage.


Summary

The main purpose of this article is to show how to integrate the SRM-MDM Catalog, with a basic step-by-step implementation guide.

 

Author:        Nitish Sharma

Company:    Mahindra Satyam

Created on:  21 June 2012


Author Bio

1.png 

 

Nitish Sharma has been associated with Mahindra Satyam for 15 months as part of the MDM practice. He is skilled in SAP MDM, SAP MDG, SRM-MDM, Java, and J2EE, and holds a Bachelor's degree in Computer Science.

                                                                                                

 

 

 

Table of Contents

 

1) Background

2) Prerequisites

3) MDM configuration

            Creating a mask

            Assigning records to mask

4) UI configuration

5) SRM configuration

6) Conclusion

7) Related Content

8) Copyright

 

1) Background


The purpose of this document is to show connectivity for the SRM-MDM Catalog, the creation of catalogs, and the creation of OCI mappings. Relevant screenshots explain the whole end-to-end procedure to set up SRM-MDM Catalog content.

 

  2) Prerequisites

 

System Type                    Service Pack    Purpose
SAP MDM 7.1 Console            7.1.07.263      To set up the SRM-MDM Catalog repository
SAP SRM 7.0                    SP9             To set up the Catalog ID & call structure
SRM-MDM Catalog 3.0            SP11            Catalog repository
SRM-MDM Catalog Content 3.0    -               SRM-MDM catalog content deployment
MDM SRM UI App                 -               To configure the OCI structure & search engine

 

  3) MDM configuration


Creating a mask

 

25.png


Assigning records to mask

26.png

 

  4) UI configuration

 

Enter all the details to login

 

4.png

 

Every user is assigned to a repository; select the user as maintained in the web service.

 

5.png

The Configuration view shows the user specific configuration of SRM MDM Catalog. In the General pane, you can set Shopping options, Item Display Criteria, Search, Shopping Lists and OCI Mapping.

 

6.png

7.png

 

The OCI mapping is shown in the screenshots below.

8.png

9.png

Under Customize Display->Item Lists, we select the fields we want to work with on the portal. Select a field and click Add.

10.png

For example

11.png

 

Under Customize Display->Cart Preview, we can preview the result of the previous step.

 

12.png

 

Under Customize Display->Compare, we can compare the positions of the fields selected in the previous step with the SRM-MDM repository fields. Using the up and down buttons on the right-hand side, we can adjust the field positions.

 

13.png

Under the Customize Search tab, you can specify the fields on which a search can be performed, and what the conditions should be, such as word "starts with n or m", etc.

 

14.png

15.png

For example

 

16.png

17.png

 

 

5) SRM configuration

 

Go to transaction SPRO (logging in to SRM requires SRM-side authorization).

 

18.png

Click on SAP Reference IMG.

19.png

 

Click on SAP Implementation->SAP SRM-> SRM Server->Master Data->Content Management-> Define external web services.

20.png

 

Enter the catalog web service ID and a description for the mask created in SAP MDM.

 

 

21.png

After creating the web service ID, go to Standard Call Structure and assign the relevant parameters, which include the repository name (to which the catalog belongs) and the catalog name (the mask name).

22.png

 

After assigning all the parameters, we have to assign a user ID to the particular catalog. Go to transaction PPOSA_BBP, click Select Attribute, then the '+' symbol to add the user ID and catalog.

24.png

 

  6) Conclusion

With the above procedure implemented, the catalog (the mask created in MDM) can be accessed on the portal, and the ID created in the last step can access only its assigned catalog. The fields selected in the UI configuration step will be visible in the portal, as will the catalog name (web service ID) given under the standard call structure. The assigned user can now access the catalog and create a shopping cart with the items assigned to that catalog.

 

7) Related Content

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/806f3d56-0d29-2d10-9abf-c0df6f0fdfe8?quicklink=index&overridelayout=true

http://help.sap.com/saphelp_srmmdm10/helpdata/en/44/ec6f42f6e341aae10000000a114a6b/frameset.htm

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0940876-42fe-2d10-77be-a82aaa163e13?QuickLink=index&overridelayout=true

http://213.41.80.15/SAP_ELearning/OKEC/nav/content/011000358700000229332007E.PDF

 

 

8) Copyright

 

© Copyright 2012 SAP AG. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors.

Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation.

IBM, DB2, DB2 Universal Database, System i, System i5, System p, System p5, System x, System z, System z10, System z9, z10, z9, iSeries, pSeries, xSeries, zSeries, eServer, z/VM, z/OS, i5/OS, S/390, OS/390, OS/400, AS/400, S/390 Parallel Enterprise Server, PowerVM, Power Architecture, POWER6+, POWER6, POWER5+, POWER5, POWER, OpenPower, PowerPC, BatchPipes, BladeCenter, System Storage, GPFS, HACMP, RETAIN, DB2 Connect, RACF, Redbooks, OS/2, Parallel Sysplex, MVS/ESA, AIX, Intelligent Miner, WebSphere, Netfinity, Tivoli and Informix are trademarks or registered trademarks of IBM Corporation.

Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.

Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of Adobe Systems Incorporated in the United States and/or other countries.

Oracle is a registered trademark of Oracle Corporation.

UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.

Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc.

HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology.

Java is a registered trademark of Oracle Corporation.

JavaScript is a registered trademark of Oracle Corporation, used under license for technology invented and implemented by Netscape.

SAP, R/3, SAP NetWeaver, Duet, PartnerEdge, ByDesign, SAP Business ByDesign, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and other countries.

Business Objects and the Business Objects logo, BusinessObjects, Crystal Reports, Crystal Decisions, Web Intelligence, Xcelsius, and other Business Objects products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of Business Objects S.A. in the United States and in other countries. Business Objects is an SAP company.

All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.

These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

MDM_CLNT_EXTR: ECC Configurations


Author: Ankush Bhardwaj

Company: Infosys Technologies Limited

Date written: 18/07/2012

Author Bio: Ankush Bhardwaj is currently working as an MDM Consultant at Infosys Technologies Limited.

Target readers: SAP MDM Consultants, SAP PI Consultants, SAP ECC Consultants, and Data Migration Teams

Keywords: SAP MDM, SAP, Master Data Extraction, MDM_CLNT_EXTR

 

 

Summary:

This document caters to the business scenario where master data resides in an ECC system and must be distributed to various target systems, such as SAP MDM, other SAP systems, or non-SAP systems. SAP PI acts as the intermediate system: it accepts the data from the sender system, converts it into the format required by the destination system, and then sends it to the target system. This document describes the landscape for extracting material master data from the SAP ERP system to target systems and provides the detailed configuration of the SAP ECC system. It deals only with the extraction procedure for material master data from the backend SAP ECC system.

 

 

 

Table Of Contents:

 

        Glossary

  1. Creating a New Logical System for Receiver PI System (Some old system can also be used if we have created one earlier)
  2. Creating the Logical System for Sender R3 System & Assigning the Client
  3. Configuration of RFC Destination Pointing to PI System
  4. Configuration of Communication Port (Transactional Port to be mapped with RFC Destination)
  5. Configuration of Partner Profile (For Outbound messages)
  6. Creation of ALE distribution Model/Customer Distribution Model

        References

 

 

 

Glossary

 

ALE - The technology for setting up and operating distributed applications.

Application Link Enabling (ALE) facilitates the distributed, but integrated, installation of SAP systems. This involves business-driven message exchange using consistent data across loosely linked SAP applications. Applications are integrated using synchronous and asynchronous communication - not by using a central database.

 

ALE consists of the following layers:

Application services

Distribution services

Communication services

 

Delta Mode - The Delta Mode is a specific distribution mode for an extraction run. A delta extraction run uses the system's change pointers to extract and distribute only those master data objects that have been changed since the last extraction run. In delta mode, the extraction framework performs some mandatory checks. As the delta mode reuses the selection and transfer criteria of an initial extraction run for a master data object, it is mandatory that this object has been initially extracted at least once; otherwise the object will be excluded from the delta extraction run. In addition, you cannot define selection and transfer criteria in delta mode.

 

Extraction Object - An Extraction Object is a valid implementation of an object specific extractor. It consists of the extraction logic how to read the master data from the system storage according to the defined selection criteria and the system/object specific implementation of the communication technology used for sending the master data to the receiver system. As a common rule, the SAP delivered object extractors for R/3 & ERP systems use ALE IDoc technology, whereas object extractors for CRM and SRM systems use the ABAP Proxy XML Message technology.

 

Distribution Mode - The distribution mode defines how the data is extracted from the SAP system and sent to the target receiving system. The extraction framework is able to support multiple, different distribution modes as the mode to be used is part of the extraction variant object configuration. The actual implementation of a distribution mode has to be done specifically by the extraction object – this implementation is not part of the generic extraction framework. The framework supports implementing an Initial and a Delta Mode only, including the generic handling of selection and transfer criteria for these modes.

 

Extraction Run - An Extraction Run is the execution of a maintained Extraction Variant. It retrieves the selected master data objects from the system and sends it to the chosen target system. An Extraction Run can either be executed in Initial or Delta Mode.

 

Extraction Variant - An Extraction Variant is the prerequisite to start an Extraction Run. The Extraction Variant consists of a name and description, the extraction object, the selection and transfer criteria, the extraction mode, the target receiving system and the resulting message size. It is the configuration of the master data extraction. Extraction Variants can be stored in the system.

 

 

Initial Mode - The Initial Mode is a specific distribution mode for an extraction run. Initial Mode executes the first, complete extraction and distribution of the chosen master data object. The objects are extracted from the system’s storage according to the given selection criteria, independent from change pointers. The scope of the distributed object is sized according to the given transfer criteria. An extraction run in initial mode is a prerequisite for extraction runs in delta mode.
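The extraction variant described in the glossary is essentially a named bundle of configuration, and a delta run is only valid after an initial run has taken place. The following is a hypothetical plain-Java sketch of that relationship; the class and field names are my own illustration, not an SAP API:

```java
public class ExtractionVariantDemo {

    // Hypothetical model of an Extraction Variant as described in the
    // glossary: name, extraction object, selection criteria, distribution
    // mode, target receiving system, and message size.
    static class ExtractionVariant {
        final String name;
        final String extractionObject;   // e.g. the material master extractor
        final String selectionCriteria;  // which objects to extract
        final String mode;               // "INITIAL" or "DELTA"
        final String targetSystem;       // receiving logical system
        final int messageSize;           // objects per message

        ExtractionVariant(String name, String extractionObject,
                          String selectionCriteria, String mode,
                          String targetSystem, int messageSize) {
            this.name = name;
            this.extractionObject = extractionObject;
            this.selectionCriteria = selectionCriteria;
            this.mode = mode;
            this.targetSystem = targetSystem;
            this.messageSize = messageSize;
        }

        // Per the glossary, a delta run is only valid once the object
        // has been initially extracted at least once.
        boolean isValidRun(boolean initiallyExtracted) {
            return !"DELTA".equals(mode) || initiallyExtracted;
        }
    }

    public static void main(String[] args) {
        ExtractionVariant v = new ExtractionVariant("MAT_DELTA", "MATMAS",
                "MTART = ROH", "DELTA", "PI_SYSTEM", 100);
        System.out.println(v.isValidRun(false)); // false: no initial run yet
        System.out.println(v.isValidRun(true));  // true
    }
}
```

The check in isValidRun captures the rule stated under Delta Mode: objects without a prior initial extraction are excluded from a delta extraction run.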

 

 

 

Steps of ALE Configurations:

 

1. Creating a New Logical System for Receiver PI System


For the ALE Configuration the first step is the creation of a Logical System for Receiver PI system.

 

Configuration Steps:

  • Go to Transaction SALE/BD54: Create New Logical System for receiver PI system.

 

                             


Figure 1: Initial Screen for t-code “SALE”

 

                             

 

  • Click New Entries on below screen to create new logical system.
  • Press SAVE button to save new creation.

 

                        

Figure 2: Screen to create new Logical Systems

 

 

  • New Logical System created.

 

                        

 

 

 

 

 

2. Creating the Logical System for Sender R3 System & Assigning the Client

 

Configuration Steps:

Go to Transaction SCC4: Linking Sender System to Client

 

  1. The BASIS team generally has access to SCC4, as it is an administration-related transaction code.
  2. Check the Logical System name for the given R/3 or ECC system and link it to the required client if necessary.
  3. Use this Logical System name as the sender ECC logical system.

 

 

 

 

 

3. Configuration of RFC Destination Pointing to PI System


An RFC Destination must be created with exactly the same name as the logical system name of the receiver PI System. This RFC Destination will point to the receiver PI System used in the integration landscape.

 

Configuration Steps:

 

  • T-Code is SM59/SALE.

                                                                           

 

Figure 3: Initial Screen of SM59 for RFC Configurations


 

  • Click on ABAP Connections and press Create icon. Below screen pops up:

                                                                         

 

Figure 4: Initial Screen to create RFC Destination


 

  • Enter values as shown below:

 

                                                      

 

Figure 5: Technical Settings for RFC Destination


 

  1. The Target Host can be entered as an IP address. Once you save the destination, the host name appears automatically.
  2. RFCDES is the table in which the entered IP address and system are stored.

 

                                                               

 

Figure 6: Logon & Security Settings for RFC Destination


 

  1. This user must be an existing user in the destination system, not in the current (sender) system. For the other tabs, the default settings are used.
  2. Save the RFC Destination.

 

 

 

 

4. Configuration of Communication Port (Transactional Port to be mapped with RFC Destination)

 

A transactional RFC communication port, as used for R/3 systems, must be created and then mapped to the RFC Destination.

 

Configuration Steps:


  • T-Code is WE21:

 

Figure 7: Initial Screen for creating Port


 

  • Click on Transactional RFC and press Create icon. Give name of Port and press ‘Enter’.

 

 

  • Fill in required fields and save the Port.

 

 

Figure 8: Technical Settings for Port

 

 

 

 

 

5. Configuration of Partner Profile (For Outbound messages)


A Partner Profile needs to be configured for the outbound message types for the material master, that is, MATMAS05 and MDMRECEIPT.

 

Configuration Steps:

 

  • T-Code is WE20: Select Partner Type LS and press Create icon.

 

 

Figure 9: Initial Screen for Creating Partner Profiles

 

  • Enter Details as below:

 

Figure 10: Outbound Parameters and other settings

 


  • For maintaining Outbound Parameters, press highlighted icon and below screen will appear:
  • Enter data as shown in the screenshot:

 

 

Figure 11: Screen for Outbound Parameter settings of Partner Profile

 

 

  • Similarly add MDMRECEIPT Message Type as well in the Partner Profile Outbound Parameters.

 

  • MDMRECEIPT is an old message type that was used in MDM 2.0 and MDM 3.0. As transaction MDM_CLNT_EXTR still has to support the old MDM releases, this IDoc type must be configured in ALE, although it is not used in standard MDM 5.5 and 7.1 scenarios. If MDMRECEIPT is not configured, the extraction can fail with the following error message:

        “BI 003 – Could not determine recipients for message type MDMRECEIPT”.

 

 

 

 

 

6. Creation of ALE distribution Model/Customer Distribution Model


This step needs to be performed to create the communication channel from the sender system to the receiver system.


A customer distribution model needs to be created with a model name; the MATMAS and MDMRECEIPT message types should then be added to it under the sender R/3 system and the receiver PI system.

 

Configuration Steps:

 

  • T-Code is BD64: Go to Change Mode and Click “Create Model View”

 

 

Figure 12: Initial Screen for Creating Distribution Model view

 


Figure 13: Pop-up screen for Creating Distribution Model

 

  • Select Model View and Click “Add Message Type”

Figure 14: Pop-up screen for Assigning Message Types to Model view

 

  • Similarly add MDMRECEIPT Message Type as well.
  • After all these steps, it will appear like this in Model View:

Figure 15: Complete Distribution Model

 

 

  • Generate Partner Profile for this Distribution Model:

Figure 16: Path for Generating Partner Profile for Distribution Model

 

  • Below Screen appears then click Execute:

Figure 17: Selection screen for generation of Partner Profile

 

  • Log will be displayed as shown below:

 

 

  • Save Model View and Then Distribute the Model to Target System:

 

Figure 18: Path for distributing model to target system

  • Select Destination System
  • Log is displayed as follows:

 

 

 

 

 

 

7. Create Variant for Initial Extraction (Selection Criteria and Transfer Segments are defined)


The master data extractor is configured by creating a variant that uses the logical system name from the ALE settings done above.

The variant name refers to the material master data, and a short description is given. The extraction object is chosen according to the message type of the master data to be exported. As the target system, the logical system name of the receiver PI system is entered. The mode is kept Initial for a first-time extraction. The block size determines how many records are bundled together in one IDoc.


Customizing Steps:

 

 

  1. T-Code is MDM_CLNT_EXTR:

 

  1. Enter name of Variant, Description.
  2. Material_Extract is the Extraction Object for extracting data for Materials.
  3. Target System would be Receiver PI System.
  4. Distribution Mode decides if you want to extract Initial Load Data or Delta Extraction.
  5. Block size decides the number of records to be combined in single MATMAS Idoc.
  6. After entering these values, Press Enter.
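To illustrate the effect of the block size: if a run selects N records and the block size is B, the extractor bundles them into ceil(N/B) MATMAS IDocs. A small plain-Java sketch of this arithmetic (illustrative only, not an SAP API):

```java
// Number of IDocs produced when 'records' master records are bundled
// 'blockSize' at a time -- a plain ceiling division, for illustration only.
class IdocBundling {
    static int idocCount(int records, int blockSize) {
        if (blockSize <= 0) {
            throw new IllegalArgumentException("block size must be positive");
        }
        return (records + blockSize - 1) / blockSize;  // ceil(records / blockSize)
    }
}
```

For example, with 2,500 selected materials and a block size of 100 the run produces 25 IDocs; with a block size of 400 it produces 7.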

 

Figure 20: Initial Screen for MDM_CLNT_EXTR


 

  1. Field Groups will appear. Here you can decide Selection Criteria as well as Transfer Criteria:

 

Figure 21: Screen for entering Selection criteria and transfer selection

 

 

 

  • Double Click on this selection segment E1MARAM and below screen will appear:

 

 

  • Enter the selection criteria for the extraction program.

 

Note: If you enter a value range for a numeric field, you need to include the leading zeroes for the concerned field. If this is not done, the extraction will start but end with the message “No Data Found”. For example, the material number MATNR must be maintained as 000000000070000000 (18 characters in total) and not as 70000000 (8 characters).
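The leading-zero rule above corresponds to the 18-character internal format of MATNR. A small helper (plain Java, illustrative only; it mirrors the zero-padding for purely numeric material numbers) shows the conversion:

```java
// Pad a numeric material number with leading zeroes to the 18-character
// internal MATNR format. Illustrative only; applies to purely numeric values.
class MatnrFormat {
    static String toInternal(String matnr) {
        StringBuilder sb = new StringBuilder();
        for (int i = matnr.length(); i < 18; i++) {
            sb.append('0');  // prepend zeroes until the value is 18 characters
        }
        return sb.append(matnr).toString();
    }
}
```

For example, `toInternal("70000000")` yields `000000000070000000`, the value to enter in the selection criteria.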

 

  • Selecting the checkbox under the heading “Transfer” ensures that the given segment is included during extraction and IDoc creation.

 

  • Then save the variant and click “Start Extraction”.
  • A background job is scheduled to extract the data.

 

         

 

  • Click on “Display Jobs” and all related jobs will be displayed at the screen:

 

Figure 22: Batch Jobs executed for MDM_CLNT_EXTR


 

  • Select Job and see Job Log and it would appear like this:

 

Figure 23: Log display for Batch Job


  • Go to transaction WE02 to see the IDocs posted after the extraction.

 

Figure 24: IDoc display for extracted IDocs

 

Software Check:

  • Ensure that the Extraction Framework and Extraction Object software packages are installed in the system; without them, the extraction will not happen.

 

 

  • With this, all the configuration settings needed for the extraction of the material master are complete. Further settings in the PI and MDM systems are needed to automate the process completely.

 

 

 

References

Working with Tuples using MDM JAVA API


Introduction

This document describes how to work with tuples in MDM.

A tuple is a hierarchical structure in MDM. It cannot exist by itself; it always has a parent table or tuple. It can be a nested structure with a one-to-many relation at every level. For example, a vendor can have multiple sets of contact information, so the contact information (address, phone number, etc.) can be grouped into a tuple that is part of the vendor table.
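Conceptually (outside any MDM API), the vendor/contact example can be modeled as a parent object owning a list of child structures — the child rows cannot exist without the parent. A plain-Java illustration, with all names invented for this sketch:

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java illustration of the tuple concept: Contact rows exist only
// inside their parent Vendor, with a one-to-many relation.
class Vendor {
    final String name;
    final List<Contact> contacts = new ArrayList<>();  // the "tuple" field

    Vendor(String name) { this.name = name; }

    static class Contact {
        final String address;
        final String phone;
        Contact(String address, String phone) {
            this.address = address;
            this.phone = phone;
        }
    }
}
```

The MDM API sections below show the same idea with the real `TupleValue`/`MultiTupleValue` classes.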

To modify or create tuple data, you first need to get the parent record in the main table that holds the tuple.

 

 

Creating new record with Tuple:

The following sample code creates a new record in MDM with multiple tuple values.

TupleDefinitionId definitionId = reposSchema.getTupleDefinition(tupleName).getId();
TupleDefinitionSchema tupleDefinitionSchema = reposSchema.getTupleDefinitionSchema(definitionId);
FieldProperties[] tupleFieldProps = tupleDefinitionSchema.getFields();
CreateRecordsCommand createRecordsCommand = new CreateRecordsCommand(userCtx);

// Create an array of Record instances sized to the number of records to insert
Record[] records = new Record[collection.size()];

// Loop over your input collection; inside the loop, create a record and set its tuple values
Record record = RecordFactory.createEmptyRecord(tableId);
records[i] = record;
MultiTupleValue multiTupleValue = new MultiTupleValue();
TupleValue tupleVal = MdmValueFactory.createTupleValue();

// Loop over the tuple value set and set a value for each field of the tuple
for (int k = 0; k < tupleFieldProps.length; k++) {
    String tupleFieldName = tupleFieldProps[k].getCode();
    // 'obj' holds the new values for this tuple row
    String tupleValue = obj.getFieldValue(tupleFieldName);
    if (tupleValue != null) {
        String fieldType = tupleFieldProps[k].getTypeName().toString();
        if (fieldType.equals("Text")) {
            tupleVal.setFieldValue(tupleFieldProps[k].getId(), new StringValue(tupleValue));
        } else if (fieldType.equals("Real")) {
            FloatValue fValue = new FloatValue(Float.parseFloat(tupleValue));
            tupleVal.setFieldValue(tupleFieldProps[k].getId(), fValue);
        } else if (fieldType.equals("Integer")) {
            if (!"".equals(tupleValue)) {
                IntegerValue intValue = new IntegerValue(Integer.parseInt(tupleValue));
                tupleVal.setFieldValue(tupleFieldProps[k].getId(), intValue);
            }
        } else if (fieldType.equals("Lookup [Flat]")) {
            tupleVal.setFieldValue(tupleFieldProps[k].getId(), new LookupValue(new RecordId(tupleValue)));
        } else if (fieldType.equals("Literal Date")) {
            try {
                SimpleDateFormat df = new SimpleDateFormat("MM/dd/yyyy");
                Calendar cal = Calendar.getInstance();
                cal.setTime(df.parse(tupleValue));
                tupleVal.setFieldValue(tupleFieldProps[k].getId(), new DateTimeValue(cal));
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}

// Add the tuple value to the MultiTupleValue instance
multiTupleValue.addValue(tupleVal);
// End of loop over the tuple value set

// Set the MultiTupleValue on the tuple field of the record
record.setFieldValue(tableSchema.getFieldId(tupleName), multiTupleValue);
// End of loop over the main table collection

// Finally, set the records on the command and execute it
createRecordsCommand.setTableId(tableId);
createRecordsCommand.setRecords(records);
createRecordsCommand.execute();

 

Modifying Record with tuple:

To modify existing records, you first need to fetch the old records from the table. Then, for each record, you can get the MultiTupleValue instance, which can be updated the same way as in the create method; only the command type changes.

 

MultiTupleValue multiTupleValue =
        (MultiTupleValue) record.getFieldValue(tableSchema.getFieldId(tupleName));
// (For new records this was: MultiTupleValue multiTupleValue = new MultiTupleValue();)

// Get all tuple values
MdmValue[] tuplevals = multiTupleValue.getValues();

// To update a particular row in the tuple value set, get its instance from the array
TupleValue tupleVal = (TupleValue) tuplevals[ind];
// The rest of the code for setting values on this TupleValue is the same as in the create case

// To add a new row to the tuple value set, create a new tuple value instance ...
TupleValue newTupleVal = MdmValueFactory.createTupleValue();
// ... and add it to the MultiTupleValue instance
multiTupleValue.addValue(newTupleVal);
// For modifying an existing tuple value, updating the TupleValue instance is enough

// Finally, set the records on the command and execute it
TableId tableId = reposSchema.getTableId(tablename);
modifyRecordsCommand.setRecords(updateRecs);
modifyRecordsCommand.setTableId(tableId);
modifyRecordsCommand.execute();

 

Deleting Tuple Value:

To delete tuple values, you also need to get the old record instance first.

MultiTupleValue multiTupleValue =
        (MultiTupleValue) record.getFieldValue(tableSchema.getFieldId(tupleName));

// Get the old tuple value instances
TupleValue[] tuplevals = multiTupleValue.getUntouchTupleRecords();

// Delete the old tuple values
for (int j = tuplevals.length - 1; j >= 0; j--) {
    multiTupleValue.removeTupleValue(tuplevals[j].getTupleRecordId());
}

 

 

Nested Tuple:

Working with nested tuples is similar to the above; only, for the child tuple, you need to get the tuple instance from its parent tuple.

// Get the child tuple's MultiTupleValue from its parent tuple value
MultiTupleValue childTupleValue =
        (MultiTupleValue) outerTupleVal.getFieldValue(tupleFieldProps[k].getId());
MdmValue[] childTupleVals = childTupleValue.getUntouchTupleRecords();

// Remove a tuple value
childTupleValue.removeTupleValue(((TupleValue) childTupleVals[j]).getTupleRecordId());

// Get the field properties of the child tuple in order to set its field values
TupleDefinitionId definitionId2 = reposSchema.getTupleDefinition("ChildTupleName").getId();
TupleDefinitionSchema tupleDefinitionSchema2 = reposSchema.getTupleDefinitionSchema(definitionId2);
FieldProperties[] fieldProperties2 = tupleDefinitionSchema2.getFields();

// Add a new tuple value ...
TupleValue newValue = MdmValueFactory.createTupleValue();
// ... set its field values (looping over fieldProperties2 as in the create case)
// and add it to the child MultiTupleValue
childTupleValue.addValue(newValue);
// For modifying an existing value, get its TupleValue instance and set the
// field values the same way.

No more Java connections can be set up to MDS


Purpose:

This article describes how to handle a connectivity problem from a front-end Java EE application to MDS.

 

 

Problem Overview:

You use CRM E-commerce, SRM MDM Catalog, MDM EC, or other Java EE applications, for which MDM Connector is used to set up the physical connection to MDS.

When you access any Java link, you get an error message saying “HTTP 500, connection timed out” or “page cannot be loaded”, although the backend repository is up and running and the same configuration has been working before.

In the default trace, you find an error like:

Cannot create JCA connection. Cause exception: Cannot get connection for 120 seconds. Possible reasons: 1) Connections are cached within SystemThread(can be any server service or any code invoked within  SystemThread in the SAP J2EE Engine) or they are allocated from a non transactional ConnectionFactory. In case of transactional ConnectionFactory used from Application Thread there is an automatic mechanism which detects unclosed connections, 2)The pool size of adapter "MDM Factory" is not enough according to the current load of the system. In case 1) the solution is to check for cached connections using the Connector Service list_conns command or in case 2), to increase the size of the pool.

 

 

Solution:

As a system administration task, you can log into the Visual Administrator or NWA (depending on the Java version) and check the current connection-related parameters to decide whether there is indeed a connection leak:

UsedManagedConnectionsCount - Returns the count of currently used managed connections.

FreeManagedConnectionsCount - The count of all available managed connections.

ManagedConnectionsUsageRate - Ratio (percent) between the used and the maximum number of managed connections for the connection pool of a specific connection factory.

MaxManagedConnectionsNumber - Connection pool maximum size.

WaitingForManagedConnectionCount - The count of clients waiting for a managed connection.

OpenedManagedConnectionsCount - Shows how many managed connections are opened per application (Table).

TimedOutConnectionRequests - The aggregated number of failed getConnection() requests, due to a heavy connection pool usage.

SAP Note 1822443 talks about how to get these values.
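The monitored values above relate to each other in a simple way: the usage rate is the used count as a percentage of the pool maximum, and a leak typically shows as a rate climbing toward 100 with a non-zero waiting count. An illustrative calculation in plain Java (method names and thresholds are assumptions for this sketch, not an SAP API):

```java
// Illustrative health check for a JCA connection pool based on the
// monitored values listed above. Not an SAP API.
class PoolHealth {
    // Used connections as a percentage of the pool maximum
    static int usageRate(int used, int max) {
        return (int) Math.round(100.0 * used / max);
    }

    // A pool looks exhausted when every managed connection is in use
    // and clients are queuing for one.
    static boolean looksExhausted(int used, int max, int waiting) {
        return used >= max && waiting > 0;
    }
}
```

For example, 1000 used connections out of a 1000-connection pool with clients waiting matches the “no more connections” symptom described in this article.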

 

To troubleshoot for possible solutions, follow the steps below:

1. SAP Note 1821867 was released for a known problem where connections accumulate in the Java connection pool: used connections are not released, the connection count reaches the maximum value (default 1000), and no new connection can be set up. First check whether this note applies.

2. If step 1 does not solve the problem, increase the values of the following parameters for the resource adapter "MDM Factory":

Maximum Waiting Time for Connections = 120 sec (default) << the waiting time for a connection to open

Maximum Connections = 1000 (default) << the pool size

1.png

3. If you still have the issue after increasing these values, use the “switch_debug_mode” and “list_conns” telnet commands. Run “list_conns” on a regular basis, so that you can find what is leaking connections before the pool gets exhausted. More details are documented in SAP Note 719778.

 

 

Related Notes:

1822443 - Connection pool parameters on different Net Weaver versions

1821867 - Accumulation of MDM connections in connection pool

719778 - DataSource fails to return connection


SAP MDM: Very useful links for reference


Hi,

 

If you are a beginner, please go through the below links where you can find information on SAP MDM.

 

The SAP MDM Console reference guide can be downloaded here:

http://help.sap.com/saphelp_nwmdm71/helpdata/en/4b/71608566ae3260e10000000a42189b/MDMConsole71.pdf

 

The SAP MDM Data Manager reference guide can be downloaded here:

http://help.sap.com/saphelp_nwmdm71/helpdata/en/4b/72b8aaa42301bae10000000a42189b/MDMDataManager71.pdf

 

The SAP MDM Import Manager reference guide can be downloaded here:

http://help.sap.com/saphelp_nwmdm71/helpdata/en/4b/72b8e7a42301bae10000000a42189b/MDMImportManager71.pdf

 

SAP MDM Syndicator reference guide can be downloaded here:

http://help.sap.com/saphelp_nwmdm71/helpdata/en/4b/711def8a722593e10000000a42189b/MDMSyndicator71.pdf

 

SAP MDM Publisher reference guide can be downloaded here:

http://help.sap.com/saphelp_nwmdm71/helpdata/en/4d/c1dd583dd56d4be10000000a42189e/MDMPublisher71.pdf

 

The SAP SCN community is at the link below:

http://scn.sap.com/community/mdm

 

A single entry point to all the SAP MDM reference guides can be found at the link below:

http://help.sap.com/nwmdm#section5

 

The SAP wiki page link is below (very useful for beginners):

http://wiki.scn.sap.com/wiki/display/SAPMDM/SAP+Master+Data+Management

 

FAQ’s on MDM:

http://wiki.scn.sap.com/wiki/display/SAPMDM/FAQ+about+SAP+NetWeaver+MDM

 

I will update this page whenever I come across more useful links.

Hope this document is useful.

 

Regards,

Revanth

SRM MDM Catalog Document.


Application Operations Guide

SAP Supplier Relationship Catalog Management powered by SAP NetWeaver.

 

 

 

 

Target Audience

System Administrators

Technology Consultants

 

 

 

Applies to:

 

Standard SRMMDMCATALOG20SP4 Repository, SAP NetWeaver MDM Console 7.1, Import Manager 7.1, Data Manager 7.1, SRM-MDM catalog content, MDM SRM UI App, SRM 7.0, SAP NetWeaver Portal For more information, visit the Master Data Management homepage.

 

Author: Girish B S

 

Company: Tech Mahindra

 

Created on: 25th September 2013

 

Table of Contents

 

1.      Introduction…………………………………………………………………………………………………………….3

 

1.1  About this Document………………………………………………………………………………………..3

1.2  Integration………………………………………………………………………………………………………..3

2.      SRM-MDM Catalog Implementation – End to End Procedure………………………………….4

2.1  MDM configuration……………………………………………………………………………………………4

2.2  SRM configuration…………………………………………………………………………………………….6

2.3  Organization Structure Set Up………………………………………………………………………….10

3.      UI configuration………………………………………………………………………………………………………12

3.1  The OCI Mapping…………………………………………………………………………………………………………15
3.2  Customize Display Tab…………………………………………………………………………………………………16
3.3  Customize Search Tab………………………………………………………………………………………………….17

 

4.      Portal View……………………………………………………………………………………………………………..18

5.      Related Contents…………………………………………………………………………………………………….19

 

  

 

1.      Introduction

 

 

1.1  About this Document

 

Purpose

 

The SRM-MDM Catalog business scenario provides a solution for creating, maintaining, and managing catalog content in your e-procurement application.

 

This guide is intended for technology consultants and system administrators. It describes the planning and preparations required for SRM MDM Catalog Operations.

 

1.2  Integration

 

You can use the SRM-MDM Catalog with SAP SRM (release 7.0 and higher) and SAP enhancement package 4 for SAP ERP 6.0.

 

Component                                                            Required/Optional

SAP NetWeaver MDM Server                                             Required

SAP NetWeaver MDM Import Server                                      Required

SRM-MDM Catalog Business Content (the SRM-MDM Catalog Repository)    Required

SRM-MDM Catalog UI (Java Web Dynpro)                                 Required

SAP NetWeaver EP Server                                              Required

 

 

2.      SRM-MDM Catalog Implementation – End to End Procedure

2.1  MDM configuration

Creating a mask

Navigate to the Mask table as shown below under Records pane create a mask and enter its details below.

 

Creating a mask.JPG

Assigning records to mask

Assigning records to mask.JPG

Once the mask is created, navigate to the main table and select the records that are to be shown to a particular user in the catalog displayed in the Portal.

Since a Named Search is dynamic compared to a mask, you can alternatively create a named search and assign the records to it; it can then be used in the web service definition explained later.

 

2.2  SRM configuration

  A web service must be defined to call the catalog repository.

Go to transaction SPRO (this requires SRM-side authorization to log in to SRM).

Click on SAP Reference IMG

3.JPG

Click on SAP Implementation->SAP SRM-> SRM Server->Master Data->Content Management-> Define external web services.

4.JPG

Enter the Web Services ID and its Description

 

5.JPG

After creating the web service ID, go to the standard call structure to assign the relevant parameters, which include the repository name (to which the catalog belongs) and the catalog name (mask name).

6.JPG


Parameter Name: (none – the call structure URL)
Parameter Value: Use the following syntax: http://<J2EEserver:J2EEport>/SRMMDM/SRM_MDM
Type: URL
Remark: <J2EEserver> is the host on which the J2EE server is installed; <J2EEport> is the HTTP port of the J2EE server.

Parameter Name: username
Parameter Value: <repository_user_name> (refers to the user in SAP MDM, not the end user starting the catalog search in SRM)
Type: Fixed Value
Remark: Required. The value can be that of any user maintained for the catalog repository in SAP MDM Console under Admin → Users.

Parameter Name: password
Parameter Value: <repository_password>
Type: Fixed Value
Remark: Optional. If you do not specify a named search, all users have the same unrestricted access to the catalog.

Parameter Name: server
Parameter Value: <MDM_server>
Type: Fixed Value
Remark: Required. Host on which your MDM server is installed.

Parameter Name: uilanguage
Parameter Value: SY-LANGU
Type: SAP Field
Remark: Required. The system uses the logon language (the initial release supports German/English/French).

Parameter Name: mask
Parameter Value: <mask_name>
Type: Fixed Value
Remark: If you do not specify a mask, all users have the same unrestricted access to the catalog.

Parameter Name: HOOK_URL
Type: Return URL
Remark: Required.

 

 
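The URL parameter follows a fixed pattern; a small helper (plain Java, illustrative only) shows how the call-structure URL is composed from the J2EE host and HTTP port:

```java
// Compose the SRM-MDM catalog call URL from the J2EE engine's host and
// HTTP port, following the syntax given in the parameter table above.
class CatalogUrl {
    static String build(String j2eeHost, int j2eePort) {
        return "http://" + j2eeHost + ":" + j2eePort + "/SRMMDM/SRM_MDM";
    }
}
```

For example, a J2EE engine on host `mdmhost` with HTTP port 50000 (both hypothetical values) would give http://mdmhost:50000/SRMMDM/SRM_MDM.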

2.3  Organization Structure Set-Up

Once the web service is up and running, the web service ID we created has to be assigned to a particular user ID (the user has to exist in SRM; use transaction USERS_GEN to create one).

Navigate to transaction PPOCA_BBP (SRM consultants will do this for you) to create the organizational structure and assign a position, followed by a central person.

7.JPG

Assign the user with the Catalog ID (Webservices) which is already created.

 

8.JPG

3.      UI configuration

Use the link, which is a combination of the Java server and port, and enter all the details to log in.

Note: Only users created in the Console with the role UI Configuration Manager can be used to log in.

 

9.JPG

10.JPG

Once you log in, you can view the list of users created in the Console with the role Catalog User in the dropdown.

 

11.JPG

Select the particular user for whom you need to perform and save the OCI mapping, and proceed.

 

12.JPG

The OCI Mapping.

 

13.JPG

Customize Display Tab

 

You can customize the available fields for the following screens: Item Lists, Shopping Cart

14.JPG

Customize Search Tab

  1. Select your preference, Yes or No, for the configuration options:

     Show Hierarchy and Selection Search in Search Tab

     Show Hierarchy and Selection Search in Advanced Search Tab

  2. In the dropdown box Hierarchy, choose the system search field type. There are two options:

     Hierarchy and Category (taxonomy)

15.JPG

4.      Portal View

Once the necessary configuration and setup are done, both on the SRM server and in SAP NetWeaver, the Portal gives a view of the catalog repository we have designed and called in the web service.

The SRM portal user can then view the catalog and perform shopping activities.

 

16.JPG

5.      Related Content

SRM-MDM Catalog Implementation – End to E... | SCN

Master Data Management homepage.

http://scn.sap.com/thread/3425616

 

Finally, I would like to thank the SAP NW MDM competency, Basis, and local SRM teams for their continuous support in developing this document.

Last but not least, I would like to thank Nitish Sharma, whose popular content on SCN gave a lot of input for designing this document.

 

 

Regards,

Girish

How to Create a Repository in SAP MDM


Hi All,

I have created this document to explain the process of creating a repository in SAP MDM. I thought of sharing my knowledge after going through the discussion “How to add repository in MDM Data Manager”.

 

How to Create a repository in SAP MDM

  1. Install the MDM servers (MDS, MDIS, MDSS, MDLS) and the DBMS server (database).
  2. Start the MDM server and the auxiliary servers (MDIS, MDSS and MDLS) as mentioned in the next step.
  3. Control Panel\System and Security\Administrative Tools > Services > MDM Server 7.1.
  4. Follow the same step to start the auxiliary servers.
  5. Install the MDM clients (Console, Data Manager, Import Manager and Syndicator).
  6. After installing the clients, open the MDM Console.
  7. Mount the MDM server in the Console.
  8. Note that the server has to be in running status.
  9. Even though the server is now running, this does not yet mean you can create a repository on the MDM server; first you need to configure the DBMS settings.
  10. Right-click on the server, choose DBMS Settings, and give the appropriate DBMS server name and the password of the DBMS server.
  11. This step establishes the connection between the MDM server and the DBMS server.
  12. Now right-click on the MDM server and choose the option Create Repository.
  13. Here you need to give the DBMS server name and password again; until then, the repository name and port fields are disabled.
  14. Once the DBMS name and password are verified, the repository name and port number fields are enabled.
  15. Type the preferred repository name and port number; this creates a new repository in SAP MDM.
  16. Once the repository is displayed under the MDM server hierarchy node, it is in stopped status by default.
  17. Design your repository as per the requirement, and once you feel the repository is ready for use, right-click on the repository and choose Start Repository.
  18. You will get two options: Immediate and Update Indices.
  19. It is good practice to choose Update Indices.
  20. The repository will now be reflected, as designed, in the other clients.
  21. If you need to change or update the structure of the repository again, go to the Console, stop the repository, and perform the action.
  22. Stopping the repository is not necessary for additions in the Admin node (Users, Roles, Ports, Remote Systems, XML Schemas, Links, etc.).

Most of the above steps cover some Basis activities; I added them for knowledge sharing.

Installation of SAP MDM system (Master Data Management)


!!Installation of MDM system!!

 

Before starting with the MDM installation, you should be familiar with the following terminology.

 

MDM Component Knowledge

The following components/instances form the base of the MDM server. They are installed by SAPinst during the installation:

· Master Data Server (MDS)

· Master Data Import Server (MDIS)

· Master Data Syndication Server (MDSS)

· Master Data Layout Server (MDLS)

 

Tool to connect to MDM servers:

The commonly used tools are (these come along with the installation):

  • MDM Console
  • MDM CLIX

 

MDM Landscape Types:

Central System

Distributed system

 

Good Point : Multiple Components on One Database

MDS can be installed in an MCOD environment, sharing the same database with other SAP components,

such as SAP ERP ECC or SAP NetWeaver Application Server.

 

 

Windows Domain OR Local Domain:

Domain Installation

In a domain installation, the user account information is stored centrally in one database on the domain controller and is accessible to all hosts in the system.

You must perform a domain installation if you install a distributed system with MDM servers and the database on different hosts (this is strongly recommended to avoid authorization problems).

 

Local Installation

In a local installation, all Windows account information is stored locally on one host and is not visible to any other hosts in the system.

To run the MDM servers and the database on a single machine, perform a local installation.

 

 

Hardware Requirements           

Minimum disk space: SAP system files – 2 GB

Minimum RAM – 5 GB

Minimum Paging File - 15 GB
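The minimum values above can be wired into a quick pre-flight check. Below is a minimal Python sketch; only the free-disk measurement is real, while RAM and paging-file sizes are passed in manually because reading them portably needs OS-specific calls:

```python
import shutil

# Minimum hardware requirements from the installation guide, in GB.
REQUIREMENTS = {"disk_gb": 2, "ram_gb": 5, "paging_gb": 15}

def missing_prereqs(disk_gb, ram_gb, paging_gb):
    """Return the names of requirements that are NOT met."""
    measured = {"disk_gb": disk_gb, "ram_gb": ram_gb, "paging_gb": paging_gb}
    return [name for name, minimum in REQUIREMENTS.items()
            if measured[name] < minimum]

# Free space on the install drive; RAM/paging entered from your host's specs.
free_gb = shutil.disk_usage("/").free / 1024**3
print(missing_prereqs(disk_gb=free_gb, ram_gb=8, paging_gb=16))
```

An empty list means the host meets all three minimums.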

 

 

Typical or Custom mode

SAPinst asks whether to run the installation in Typical or Custom mode. If you choose Typical, SAPinst

provides automatic default settings and you only respond to a minimum number of prompts. However, you

can still change any of the default settings on the parameter summary screen.

The following tables list the basic system parameters that you must specify before installing your SAP

system in typical and in custom mode.

 

 

Create the New SAP System Users <sapsid>adm and SAPService<SAPSID>

  1. In Active Directory Users and Computers Console, right-click Users in Tree and choose New User.
  2. Enter the following:

Field                 Input for <sapsid>adm              Input for SAPService<SAPSID>

First name:        None                                        None

Initials:              None                                        None

Last name:        None                                        None

Full name:         <sapsid>adm                            SAPService<SAPSID>

User logon name: <sapsid>adm                        SAPService<SAPSID>

 

 

 

<<DB installation is a separate task. Please install the database separately.>>

 

Installing and Configuring the Database

 

Starting from MDM 7.1 SP08, the installation for Windows asks you which database server[s] will be used by MDM.

 

Note the following when planning the MDS and database combination:

· Make sure that your operating system meets the prerequisites for the database you want to run,

including any necessary service packs, hot fixes, libraries, or dlls. For more information, see the OS

and DB documentation.

 

· DBMS client software must be installed on the machine that runs the MDS. Make sure that the

MDM user SAPService<SID> has the necessary rights for accessing the database client.

 

· The DBMS user that MDM uses to connect to a DBMS must have access rights equivalent to the

system user. You can use the system account or create a user with equivalent rights. The built-in

system accounts are as follows:

- SQL Server – sa (reserved)

- Oracle – system (reserved)

- DB2 – db2admin (configurable installation option)

- MaxDB - dbm (configurable installation option)

 

!!  Start with Installation  !!

The MDM 7.1 installation software brings you to Support Package level 8.

This means that if you want to run SAP MDM 7.1 on a later Support Package, you have to install SAP MDM 7.1 SP08 first and then upgrade the system to the next level.

 

Download the software for SAP NetWeaver MDM 7.1:

 

Installations and Upgrades – M – "SAP MDM" – SAP NETWEAVER MDM 7.1

 

1.PNG

 

1.PNG

 

Now extract the files: just double-click the first EXE file. It will extract all the remaining files into a new folder such as “51040422” and keep all the files there.

 

1.PNG

 

SAPinst: you now require SAPinst to start the installation.

 

Go to the location below and extract the file “MDMIM71008_0.ZIP”.

 

MDM\51040422\DATA_UNITS\NW_MDM_71_SP08\MDM INSTALLATION MASTER 7.1\WIN32

 

You can find the SAPinst file at the following location:

 

\MDM\51040422\DATA_UNITS\NW_MDM_71_SP08\MDM INSTALLATION MASTER 7.1\WIN32\MDMIM71008_0\MDM_IM_WINDOWS_I386

 

1.PNG

 

Double-click sapinst.exe to start the installation of SAP MDM 7.1 SP08.

 

1.PNG

 

Considering that we are keeping all MDM server instances on a single host, we select the Central System option in SAPinst:

· Master Data Server (MDS)

· Master Data Import Server (MDIS)

· Master Data Syndication Server (MDSS)

                   Select SAP MDM installation – Central System – Central System

1.PNG

 

 

Select Custom so that you can configure the system as per your requirements.

 

 

1.PNG

 

Enter the SID you want to use (3 letters only)

and the destination where you want to install MDM.

 

1.PNG

 

Master Password: It is used for all the users that are created.

 

1.PNG

 

Select one of the options below as per your requirement. If you are installing the system for personal use, go with the first option; otherwise I would recommend a Domain Installation so that there are no authorization problems.

 

1.PNG

 

 

Assuming that the <sapsid>adm and SAPService<SAPSID> users have been created in advance, provide the password of the service user. (Otherwise, create them first from SAPinst itself.)

 

1.PNG

 

The location below is filled in automatically; otherwise, you can find it at:

 

\51040422\DATA_UNITS\NW_MDM_71_SP08\MDM INSTALLATION MASTER 7.1\NT_X64\MDMIM71008_0\MDM_IM_WINDOWS_X86_64\COMMON\INSTALL

 

It is used to check host-related information, i.e. whether the host fulfills all the prerequisites for installing the application.

 

1.PNG

 

Host Agent and SAPOSCOL log directory information

 

1.PNG

 

sapadm: this user is used to run the Host Agent and SAPOSCOL.

We will select Local; selecting Domain of Current User will create problems.

(If you select Domain, sapadm becomes a domain user, which creates problems if you later want to install other SAP applications.)

 

1.PNG

 

1.PNG

 

Here you will have to provide parameters for MDS (Master Data Server) instance

Instance No :  ?

 

 

1.PNG

 

MDS Port No. : 59950 (the default MDS port)

MDS SSL Port No. : 59951 (the default MDS SSL port)

 

 

1.PNG

 

Deselect “Set up SSL in MDM Server”.

I have not set up SSL here because it was not required in my project. However, you can set it up later as well; there is no issue with that.

 

1.PNG

 

Good point: you can use the MDM system with more than one database; you are not limited to a single one and can select several.

This means you can use repositories from Oracle and SQL Server simultaneously.

 

 

1.PNG

 

Here you will have to provide parameters for MDIS (Master Data import Server) instance

Instance No :  ?

 

1.PNG

 

MDIS Port No. : 59750 (the default MDIS port)

MDIS SSL Port No. : 59751 (the default MDIS SSL port)

 

 

1.PNG

 

Here you will have to provide parameters for MDSS (Master Data Syndication Server) instance

Instance No :  ?

 

22.PNG

 

MDSS Port No. : 59850 (the default MDSS port)

MDSS SSL Port No. : 59851 (the default MDSS SSL port)
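The three default port pairs can be summarized in one place. A small sketch: the base ports are the SAPinst defaults shown in the screens above, and each SSL port is simply the base port + 1:

```python
# Default MDM server ports from the SAPinst screens (non-SSL base ports).
DEFAULT_PORTS = {"MDS": 59950, "MDIS": 59750, "MDSS": 59850}

def ssl_port(server):
    """Each server's default SSL port is its base port + 1."""
    return DEFAULT_PORTS[server] + 1

for server, port in sorted(DEFAULT_PORTS.items()):
    print(f"{server}: port {port}, SSL port {ssl_port(server)}")
```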

 

23.PNG

 

Select the software from the respective location for MDS/MDIS/MDSS/Shared Server Content

 

24.PNG

 

<<You should have SAPCRYPTOLIB for this step>>

Browse the Download Catalog:

SAP Cryptographic Software – SAPCRYPTOLIB – SAPCRYPTOLIB

Select your OS and download the SAPCRYPTOLIB file.

Unpack it with SAPCAR and select it from the SAPinst tool.

 

25.PNG

 

Below Screen tells about selected Archives to be unpacked at given location.

 

26.PNG

 

 

You will see the screen below; continue:

 

27.PNG

 

Below is the summary of all the selection we have made.

 

28.PNG

 

Once you press Next, the installation starts as shown below:

 

29.PNG

 

 

30.PNG

 

 

!!Congratulations!!

 

MDM server has been installed.

 

The SAP MDM Console will be on your desktop for working with the MDM server.

 

31.PNG

 

 

Once you are able to connect to the MDM server for the first time, you need to set up the DB connectivity for the MDM server.

 

[If you add an entry for the DBMS server, the connection between the DBMS and the MDM server is established, and subsequently the repository data is stored in the DBMS.]

 

32.PNG

 

 

33.PNG

 

DBMS Server Name: provide the name of the server where the database is installed.

User Name: you need a user in the database that allows the MDM server to connect.

Password: provide the password of that user.

 

 

Now you are connected to MDM Server for further work.

 

<<Troubleshooting is yours, if any>>

 

MDM Directory

· Global host, shared with the network share sapmnt.

The global host is the host where the primary MDS instance is installed.

On global hosts, the \usr\sap directory contains general SAP software as well as global and local (instance-specific) data.

SAPinst creates the global directory usr\sap\<SAPSID>\SYS. There is exactly one physical directory for each SAP system. It consists of the following subdirectories:

- global – contains globally shared data

- profile – contains the profiles for all instances

- exe – contains the executable replication directory for all instances

· Local host, shared with the name saploc.

On local hosts, the \usr\sap\<SAPSID>\<instance_name> directory contains copies of the SAP software and local (instance-specific) data.

Under \usr\sap\<SAPSID>\<instance_name>\config, for example, you find the MDM-specific configuration files for the MDM servers:

- mds.ini

- mdis.ini

- mdss.ini

In the MDS instance folder under mdm, you find the following MDS-specific sub-folders:

- accelerators

- archives

- distributions

- reports
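The server configuration files listed above use plain INI syntax, so they can be inspected programmatically. A hedged sketch: the section and key names below are illustrative assumptions, not the authoritative mds.ini schema — check your installed file for the actual entries:

```python
import configparser

# Illustrative mds.ini-style content; real files live under
# \usr\sap\<SAPSID>\<instance_name>\config\mds.ini (path is an example).
sample = """\
[MDM Server]
Autostart=True
Port=59950
"""

cfg = configparser.ConfigParser()
cfg.read_string(sample)  # for a real file: cfg.read(r"...\config\mds.ini")
print(cfg["MDM Server"]["Port"])
```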

 

 

 

Entries in the Services File Created by SAPinst

Once the installation is complete, SAPinst creates the following entries in:

 

<drive:>\WINDOWS\system32\drivers\etc\services:

 

sapdpXX = 32XX/tcp

sapdpXXs = 47XX/tcp

sapgwXX = 33XX/tcp

sapgwXXs = 48XX/tcp

 

where XX is set from 00 to 99.
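The entries follow a fixed pattern keyed on the two-digit instance number. A small sketch that generates them, assuming the standard SAP service names (sapdpXX, sapdpXXs, sapgwXX, sapgwXXs):

```python
def services_entries(instance_no):
    """Service-file entries SAPinst creates for instance number 00-99."""
    xx = f"{instance_no:02d}"
    return {
        f"sapdp{xx}":  f"32{xx}/tcp",   # dispatcher
        f"sapdp{xx}s": f"47{xx}/tcp",   # dispatcher, secure
        f"sapgw{xx}":  f"33{xx}/tcp",   # gateway
        f"sapgw{xx}s": f"48{xx}/tcp",   # gateway, secure
    }

print(services_entries(0))
```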

 

 

The installation guide below will help you further.

https://websmp108.sap-ag.de/~sapidb/011000358700001668482008E

Getting Started with SAP NetWeaver Master Data Management


Today, large and mid-scale companies operating on the base of diversified IT landscapes often suffer from master data that is inconsistently stored in multiple, disconnected systems or databases. Unmanaged master data is notoriously inaccurate, redundant, and full of discrepancies, all of which result in high maintenance costs, vulnerable business processes and poor business decisions.

SAP NetWeaver Master Data Management (MDM) enables companies to consolidate and harmonize their master data within heterogeneous IT landscapes. It consistently delivers vastly reduced data maintenance costs, ensures cross-system data consistency, accelerates the execution of business processes, and greatly improves decision-making.

 

 

Overview

SAP NetWeaver Master Data Management Solution Brief

This solution brief provides an overview of the master data management features and processes of SAP NetWeaver Master Data Management.

 

SAP NetWeaver Master Data Management: Animated Overview   (Time 00:02:30) 

An animated flash presentation providing an overview of SAP NetWeaver Master Data Management and its usage scenarios.

 

Watch MDM and BPM in Action   (Time 00:05:45) 

This animated demo shows how to flexibly compose a governed data creation process based on SAP NetWeaver BPM and SAP NetWeaver MDM, and why this is beneficial in heterogeneous landscapes.

 

Getting Started with MDM Guides  

Release 5.5  |  Release 7.1

This document gives a high-level overview of SAP NetWeaver Master Data Management (MDM), including its functional components and main features. A tutorial specifically dedicated to new MDM customers provides a guided tour through the product and allows the readers to get hands-on experience.

MDM Tutorial Sample Data 

 

Release 5.5  |  Release 7.1

This package contains the sample data required to go through the step-by-step tutorial for new MDM customers.

 

SAP NetWeaver MDM Overview   

This presentation provides an overview of SAP NetWeaver MDM including main capabilities, applicable scenarios, and customer examples. For deeper insight into the master data integration, master data operation and master data quality capabilities, see the level 2 presentation.

 

SAP NetWeaver Master Data Management 7.1   

Get information on the current release, SAP NetWeaver MDM 7.1.

 

More Information

Misaligned Master Data is a Compromised Corporate Asset  

Read this SAPInsider article to get a comprehensive overview of SAP NetWeaver Master Data Management in a business context.

SAP MDM Global Data Synchronization


Collaborative business Based on Accurate and Consistent Data

SAP MDM GDS

SAP NetWeaver Master Data Management enables global data synchronization (GDS), supporting product data consistency, quality, and distribution between trading partners.

Automated and workflow guided, it lets you manage data and communication in an integrated central console for better supply chain operations according to Global Data Synchronization Network (GDSN) terms.

 

Getting Started with Global Data Synchronization

Global Data Synchronization - Solution Brief 

The Global Data Synchronization Solution Brief provides a high-level overview of this SAP NetWeaver MDM based business scenario.

 

Overview of SAP GDS 2.1  

Presentation outlining the features and enhancements of Global Data Synchronization, release 2.1.

 

Colgate-Palmolive on the Benefits of Global Data Synchronization   (Time 03:55)

Colgate-Palmolive speaks about the benefits of Global Data Synchronization with SAP NetWeaver MDM.

 

FEATURED EVENTS   (Time 38:00)

A tutorial eLearning session providing an overview of Global Data Synchronization with SAP NetWeaver MDM (based on the GDS 1.0 release).

 

More on Global Data Synchronization

Industry Standards Drive Business Network Transformation   

Industry standards bodies such as GS1 and RosettaNet are essential for companies to transform their business networks to support business processes that span these networks using disparate applications. SAP is closely aligned with these standards organizations and, along with our partners, has adopted these standards throughout our solution offerings.

 

Functional Documentation  

SAP Help Portal documentation describing features and functions of the current release, Global Data Synchronization 2.1.

 

Technical Documentation  (SMP Login Required)

Technical Guides about the current release, Global Data Synchronization 2.1.

Featured Content in SAP NetWeaver Master Data Management


What's New with Support Package 11 for SAP NetWeaver MDM 7.1

Find out about the enhancements coming with SAP NetWeaver MDM 7.1 SP 11. 03 January 2014

 

Join Customer Survey on SAP NetWeaver Master Data Management

Please help to continuously enhance SAP NetWeaver Master Data Management and take part in the current customer survey. 03 December 2013

 

Combining SAP NetWeaver MDM with SAP Lumira

Read this blog to find out how you can visualize master data information for analytical purposes using SAP Lumira. 20 November 2013

 


'Update Tuple Table'



Hi All,

 

I am creating this document to answer the following thread: http://scn.sap.com/thread/3497138.

Update Tuple Table

 

If you want to add one more record without updating or replacing the previous one, you need to set the Multi-Valued property to Yes for the tuple in the Console.

1.jpg

Once you have changed this option from No to Yes, you can add multiple records for a tuple directly in the Data Manager. However, if the table is imported automatically, you also need to set the tuple update option in the Import Manager for the specified map.

2.jpg

 

 

Set matching Tuple Fields:

1) Open the specified map if it exists, or else import a sample file and choose the source and destination files in the Import Manager.

2) Now map the fields in Map Fields/Values.

3) Now right-click on the tuple field and click Set Tuple Update option.

3.jpg

4) Now choose Update (Null or Mapped) according to your business requirement.

4.jpg

5) Now use Save Map for a new map or Save Map Update for an existing map.

6) If the tuple is a nested hierarchy tuple, the Set Matching Tuple Fields option has to be set at the parent node.

Kindly go through the MDM Import Manager Reference Guide for more details regarding Set Matching Tuple Fields.

How to Optimize Performance of SAP NW MDM System


Introduction

 

This document is a compilation of settings and customizations that can be applied to improve the performance of your SAP NW MDM system. Each parameter is listed with its definition and a recommended value.

 

MDIS Global Parameter

 

Interval

The number of seconds MDIS waits after scanning ports before restarting scanning.

Recommended value: 300




Tracing Level

Tracing Level lets you customize the minimum severity level recorded to meet your own requirements.

Available values:

a) debug

b) flow

c) info

d) warning

e) error

Recommended value: set it to warning, as it has the least impact on server performance.
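The "minimum severity recorded" semantics of the tracing level can be illustrated with Python's logging module. The mapping below is mine, purely for illustration; "flow" has no direct Python counterpart and is folded into INFO:

```python
import logging

# MDM tracing levels mapped onto Python logging severities (illustrative).
LEVELS = {"debug": logging.DEBUG, "flow": logging.INFO, "info": logging.INFO,
          "warning": logging.WARNING, "error": logging.ERROR}

logging.basicConfig(level=LEVELS["warning"])   # recommended setting
log = logging.getLogger("mdis")
log.info("port scan details")         # below threshold: suppressed
log.warning("malformed source file")  # at/above threshold: recorded
```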

 

 

 

MDIS Parameters Repository Level

 

Parameter "Inter-Chunk Delay MS":

If data imports are performed during online working hours, users cannot access the repository due to long-running imports. To resolve this problem, set this parameter to a reasonable value to allow repository access between the imports of chunks. This parameter can be set for each individual repository.

Recommended value: 5000


 

File Aggregation

In certain scenarios, it may be more efficient for MDIS to import files from a port in batches rather than as individual files. For example, if you are generating large numbers of small import files (containing one or two records each), MDIS can import these files faster by processing them collectively than by processing each file separately.

Recommended value: 10-50



Chunk Size

The Chunk Size parameter defines the number of virtual extended records to include in a chunk. Typically, the smaller the chunk size the better, as smaller chunks can be processed faster within MDIS and also require less wait time in transactions between MDIS and the Master Data Server.

Recommended value: 50000

 

 

 

No. of Chunks Processed In Parallel

The No. of Chunks Processed in Parallel parameter optimizes the way chunks are processed within MDIS. This parameter determines how many chunks MDIS can manage simultaneously.

Recommended value: 5-10
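The interplay of Chunk Size and No. of Chunks Processed in Parallel can be illustrated with simple arithmetic. A sketch using the recommended values; the "waves" figure is my illustration of batching, not MDIS's actual scheduler:

```python
import math

def chunk_plan(total_records, chunk_size=50_000, parallel_chunks=5):
    """Split an import into chunks and group them into parallel 'waves'."""
    chunks = math.ceil(total_records / chunk_size)
    waves = math.ceil(chunks / parallel_chunks)
    return chunks, waves

print(chunk_plan(1_200_000))  # 24 chunks, handled roughly 5 at a time
```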

MDSS Global Parameter


Tracing Level

Tracing Level lets you customize the minimum severity level recorded to meet your own requirements.

Available values:

a) debug

b) flow

c) info

d) warning

e) error

Recommended value: warning, as it has the least impact on server performance.

 

 

 

Suppress unchanged records option

You can limit the records included in a syndication file to only those that have changed since the last time they were syndicated to the mapped remote system. This improves the performance of the syndication server.

Recommendation: If your business requirements allow it, enable the option that suppresses unchanged records in the MAP properties in order to reduce the number of records involved in syndication and reduce job run-times. Record suppression can be done at remote system level or port level

 

 

 

Suppress records without key option

Recommendation: If your business requirements allow it, enable the 'Suppress Records Without Key' option in the MAP properties to reduce the number of records involved in syndication and reduce job run-times.

 

 

 

Import and Syndication schedule without overlaps

In the case of multiple repositories on a single server, you should be aware that imports and syndications are functions that are scheduled sequentially by the MDM server. If many of these tasks are started at the same time, the processes are handled sequentially, which can cause performance bottlenecks.

Recommendation:

Imports should be scheduled so that they are not performed during user operation time. Syndication should also not run during import processes. Try to schedule imports so that they do not overlap with syndication.
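A quick way to sanity-check a planned schedule is a pairwise overlap test on the job time windows. A sketch with made-up job names and times:

```python
def overlaps(a, b):
    """True if two (start, end) windows, in minutes of the day, intersect."""
    return a[0] < b[1] and b[0] < a[1]

imports      = {"MATERIAL import":      (0, 60)}    # 00:00-01:00
syndications = {"MATERIAL syndication": (30, 90)}   # 00:30-01:30

for iname, iwin in imports.items():
    for sname, swin in syndications.items():
        if overlaps(iwin, swin):
            print(f"Conflict: '{iname}' overlaps '{sname}'")
```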




No parallel imports into multiple repositories

The import process will not only lock the repositories, but it also has to place short server locks on the MDM Server. For this reason, performance losses can be expected if you import data to several repositories at the same time.

Recommendation: In the case of multiple repositories, put together a schedule so that you do not have to import data into several repositories at the same time.

 

 


Repository Designing Considerations:

Sort Index:

Sort Index creates an in-memory sort index for the field and enables sorting on the UI in ascending or descending order.

Setting "Sort Index" to "Normal" for a large number of fields can have a negative impact on the repository loading time and main memory consumption.

Recommendation: Make only those fields sortable that are required for matching and sorting.




Workflow Designing:

Workflows provide a powerful way to automate different processes within MDM System.

Recommendation: Avoid workflow loops, as they degrade system performance. Delete completed workflows at the end, or archive them if a report is necessary for audit purposes.

SRM-MDM Catalog Implementation – End to End Procedure


SRM-MDM Catalog Implementation – End to End Procedure

 

 

Applies to:

Standard SAP MDM 7.1 Console, SAP SRM 7.0, SRM–MDM catalog 3.0 SP11, SRM-MDM catalog content 3.0, MDM SRM UI App. For more information, visit the Master Data Management homepage.


Summary

The main purpose of this article is to show how to integrate the SRM-MDM Catalog: a basic step-by-step guide to implementing the SRM-MDM Catalog.

 

Author:        Nitish Sharma

Company:    Mahindra Satyam

Created on:  21 June 2012


Author Bio

1.png 

 

Nitish Sharma has been associated with Mahindra Satyam for 15 months and has been a part of MDM practice. He is skilled in SAP MDM, SAP MDG, SRM MDM, Java, J2EE and completed his Bachelor’s degree in Computer Science.

                                                                                                

 

 

 

Table of Contents

 

1) Background

2) Prerequisites

3) MDM configuration

            Creating a mask

            Assigning records to mask

4) UI configuration

5) SRM configuration

6) Conclusion

7) Related Content

8) Copyright

 

1) Background


The purpose of this document is to show connectivity for SRM-MDM Catalog, creation of catalogs and creation of OCI mappings. Relevant screenshots are provided that explain the whole end to end procedure to set up SRM-MDM Catalog content.

 

  2) Prerequisites

 

System Type                     Service Pack    Purpose

SAP MDM 7.1 Console             7.1.07.263      To set up the SRM-MDM Catalog repository

SAP SRM 7.0                     SP9             To set up the catalog ID & call structure

SRM–MDM catalog 3.0             SP11            Catalog repository

SRM-MDM catalog content 3.0     –               SRM-MDM catalog content deployment

MDM SRM UI App                  –               To configure the OCI structure & search engine

 

  3) MDM configuration


Creating a mask

 

25.png


Assigning records to mask

26.png

 

  4) UI configuration

 

Enter all the details to login

 

4.png

 

Every user is assigned to a repository; select the user as maintained in the web service.

 

5.png

The Configuration view shows the user specific configuration of SRM MDM Catalog. In the General pane, you can set Shopping options, Item Display Criteria, Search, Shopping Lists and OCI Mapping.

 

6.png

7.png

 

The OCI Mapping is shown in below screenshot.

8.png

9.png

Here under Customize Display -> Item Lists, we select the fields we want to work with on the portal. Just select a field and click Add.

10.png

For example

11.png

 

Here under Customize Display -> Cart Preview, we can check the preview of the previous step.

 

12.png

 

Here under Customize Display -> Compare, we can compare the positions of the fields selected in the previous step with the SRM-MDM repository fields. Using the up and down buttons on the right-hand side, we can adjust the field positions.

 

13.png

Under the Customize Search tab, you can specify on which fields a search can be performed and what the conditions should be, e.g. word “starts with n or m”.

 

14.png

15.png

For example

 

16.png

17.png

 

 

5) SRM configuration

 

Go to transaction SPRO (this requires SRM-side authorization to log in to SRM).

 

18.png

Click on SAP Reference IMG

19.png

 

Click on SAP Implementation->SAP SRM-> SRM Server->Master Data->Content Management-> Define external web services.

20.png

 

Enter the catalog web service ID and a description for the mask created in SAP MDM.

 

 

21.png

After creating the web service ID, go to the standard call structure to assign the relevant parameters, which include the repository name (to which the catalog belongs) and the catalog name (mask name).

22.png

 

After assigning all the parameters, we have to assign a user ID to the particular catalog. Go to transaction PPOSA_BBP, click on Select Attribute, then the ‘+’ symbol to add the user ID and catalog.

24.png

 

  6) Conclusion

With the above procedure implemented, we are now able to access the catalog (the mask created in MDM) on the portal; the ID created in the last step can access only the assigned catalog. The fields selected in the UI configuration step will be visible in the portal, as will the catalog name (web service ID) given under the standard call structure. The assigned user can now access the catalog and create a shopping cart with the items assigned to that catalog.

 

7) Related Content

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/806f3d56-0d29-2d10-9abf-c0df6f0fdfe8?quicklink=index&overridelayout=true

http://help.sap.com/saphelp_srmmdm10/helpdata/en/44/ec6f42f6e341aae10000000a114a6b/frameset.htm

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0940876-42fe-2d10-77be-a82aaa163e13?QuickLink=index&overridelayout=true

http://213.41.80.15/SAP_ELearning/OKEC/nav/content/011000358700000229332007E.PDF

 

 

8) Copyright

 

© Copyright 2012 SAP AG. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors.

Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation.

IBM, DB2, DB2 Universal Database, System i, System i5, System p, System p5, System x, System z, System z10, System z9, z10, z9, iSeries, pSeries, xSeries, zSeries, eServer, z/VM, z/OS, i5/OS, S/390, OS/390, OS/400, AS/400, S/390 Parallel Enterprise Server, PowerVM, Power Architecture, POWER6+, POWER6, POWER5+, POWER5, POWER, OpenPower, PowerPC, BatchPipes, BladeCenter, System Storage, GPFS, HACMP, RETAIN, DB2 Connect, RACF, Redbooks, OS/2, Parallel Sysplex, MVS/ESA, AIX, Intelligent Miner, WebSphere, Netfinity, Tivoli and Informix are trademarks or registered trademarks of IBM Corporation.

Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.

Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of Adobe Systems Incorporated in the United States and/or other countries.

Oracle is a registered trademark of Oracle Corporation.

UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.

Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc.

HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology.

Java is a registered trademark of Oracle Corporation.

JavaScript is a registered trademark of Oracle Corporation, used under license for technology invented and implemented by Netscape.

SAP, R/3, SAP NetWeaver, Duet, PartnerEdge, ByDesign, SAP Business ByDesign, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and other countries.

Business Objects and the Business Objects logo, BusinessObjects, Crystal Reports, Crystal Decisions, Web Intelligence, Xcelsius, and other Business Objects products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of Business Objects S.A. in the United States and in other countries. Business Objects is an SAP company.

All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.

These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

MDM_CLNT_EXTR :ECC Configurations


Author: Ankush Bhardwaj

Company: Infosys Technologies Limited

Date written: 18/07/2012

Author Bio: Ankush Bhardwaj is currently working as an MDM Consultant in Infosys Technologies Limited

Target readers: Sap MDM consultant, SAP PI Consultants, SAP ECC Consultants and Data Migration Teams

Keywords: SAP MDM, SAP, Master Data Extraction, MDM_CLNT_EXTR

 

 

Summary:

This document caters to the business scenario where master data resides in an ECC system and needs to be distributed to various target systems such as SAP MDM or other SAP or non-SAP systems. It uses SAP PI as the intermediate system that accepts data from the sender system, converts it into the format required by the destination system, and then sends it to the target system. This document describes the landscape for extracting material master data from the SAP ERP system to target systems and provides the detailed configuration of the SAP ECC system. It deals only with the extraction procedure for material master data from the backend SAP ECC system.

 

 

 

Table Of Contents:

 

        Glossary

  1. Creating a New Logical System for Receiver PI System (Some old system can also be used if we have created one earlier)
  2. Creating the Logical System for Sender R3 System & Assigning the Client
  3. Configuration of RFC Destination Pointing to PI System
  4. Configuration of Communication Port (Transactional Port to be mapped with RFC Destination)
  5. Configuration of Partner Profile (For Outbound messages)
  6. Creation of ALE distribution Model/Customer Distribution Model

        References

 

 

 

Glossary

 

ALE - The technology for setting up and operating distributed applications.

Application Link Enabling (ALE) facilitates the distributed, but integrated, installation of SAP systems. This involves business-driven message exchange using consistent data across loosely linked SAP applications. Applications are integrated using synchronous and asynchronous communication - not by using a central database.

 

ALE consists of the following layers:

Application services

Distribution services

Communication services

 

Delta Mode - The Delta Mode is a specific distribution mode for an extraction run. A delta extraction run uses the system’s change pointers to extract and distribute only those master data objects that have been changed since the last extraction run. In delta mode, the extraction framework performs some mandatory checks. As the delta mode reuses the selection and transfer criteria of an initial extraction run for a master data object, it is mandatory that this object has been initially extracted at least once. Otherwise the object will be excluded from the delta extraction run. In addition, you cannot define selection and transfer criteria in delta mode.

 

Extraction Object - An Extraction Object is a valid implementation of an object specific extractor. It consists of the extraction logic how to read the master data from the system storage according to the defined selection criteria and the system/object specific implementation of the communication technology used for sending the master data to the receiver system. As a common rule, the SAP delivered object extractors for R/3 & ERP systems use ALE IDoc technology, whereas object extractors for CRM and SRM systems use the ABAP Proxy XML Message technology.

 

Distribution Mode - The distribution mode defines how the data is extracted from the SAP system and sent to the target receiving system. The extraction framework is able to support multiple, different distribution modes as the mode to be used is part of the extraction variant object configuration. The actual implementation of a distribution mode has to be done specifically by the extraction object – this implementation is not part of the generic extraction framework. The framework supports implementing an Initial and a Delta Mode only, including the generic handling of selection and transfer criteria for these modes.

 

Extraction Run - An Extraction Run is the execution of a maintained Extraction Variant. It retrieves the selected master data objects from the system and sends them to the chosen target system. An Extraction Run can be executed either in Initial or in Delta Mode.

 

Extraction Variant - An Extraction Variant is the prerequisite to start an Extraction Run. The Extraction Variant consists of a name and description, the extraction object, the selection and transfer criteria, the extraction mode, the target receiving system and the resulting message size. It is the configuration of the master data extraction. Extraction Variants can be stored in the system.
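The components of an extraction variant listed above can be sketched as a simple value object. This is an illustrative sketch only; the class, field names, and sample values (such as the variant name and logical system) are assumptions, not system objects — in the system, the variant is maintained via transaction MDM_CLNT_EXTR.

```java
// Illustrative value object mirroring the components of an extraction variant.
public class ExtractionVariant {
    String name;               // variant name
    String description;        // short description
    String extractionObject;   // extraction object, e.g. Material_Extract
    String targetSystem;       // logical system name of the receiver system
    String mode;               // distribution mode: "INITIAL" or "DELTA"
    int blockSize;             // number of records bundled into one IDoc

    ExtractionVariant(String name, String description, String extractionObject,
                      String targetSystem, String mode, int blockSize) {
        this.name = name;
        this.description = description;
        this.extractionObject = extractionObject;
        this.targetSystem = targetSystem;
        this.mode = mode;
        this.blockSize = blockSize;
    }

    public static void main(String[] args) {
        // Hypothetical sample values for a first-time material extraction
        ExtractionVariant v = new ExtractionVariant(
                "ZMAT_INIT", "Initial material extraction",
                "Material_Extract", "PICLNT001", "INITIAL", 100);
        System.out.println(v.name + " -> " + v.targetSystem);
    }
}
```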

 

 

Initial Mode - The Initial Mode is a specific distribution mode for an extraction run. Initial Mode executes the first, complete extraction and distribution of the chosen master data object. The objects are extracted from the system’s storage according to the given selection criteria, independent from change pointers. The scope of the distributed object is sized according to the given transfer criteria. An extraction run in initial mode is a prerequisite for extraction runs in delta mode.

 

 

 

Steps of the ALE Configuration:

 

1. Creating a New Logical System for Receiver PI System


The first step of the ALE configuration is the creation of a logical system for the receiver PI system.

 

Configuration Steps:

  • Go to Transaction SALE/BD54: Create New Logical System for receiver PI system.

 

                             


Figure 1: Initial Screen for t-code “SALE”

 

                             

 

  • Click New Entries on the below screen to create a new logical system.
  • Press the SAVE button to save the new entry.

 

                        

Figure 2: Screen to create new Logical Systems

 

 

  • New Logical System created.

 

                        

 

 

 

 

 

2. Creating the Logical System for Sender R3 System & Assigning the Client

 

Configuration Steps:

Go to Transaction SCC4: Linking Sender System to Client

 

  1. The BASIS team generally has access to SCC4, as it is an administration-related T-code.
  2. The logical system name for the given R/3 or ECC system needs to be checked and linked to the required client if necessary.
  3. This logical system name should be used as the sender ECC LS.

 

 

 

 

 

3. Configuration of RFC Destination Pointing to PI System


An RFC Destination is to be created with exactly the same name as the LS name of the receiver PI system. This RFC Destination will point to the receiver PI system used in the integration landscape.

 

Configuration Steps:

 

  • T-Code is SM59/SALE.

                                                                           

 

Figure 3: Initial Screen of SM59 for RFC Configurations


 

  • Click on ABAP Connections and press the Create icon. The below screen pops up:

                                                                         

 

Figure 4: Initial Screen to create RFC Destination


 

  • Enter values as shown below:

 

                                                      

 

Figure 5: Technical Settings for RFC Destination


 

  1. The Target Host can be entered as an IP address; once saved, it will automatically be shown as the host name.
  2. RFCDES is the table referred to for the IP address and system entries.

 

                                                               

 

Figure 6: Logon & Security Settings for RFC Destination


 

  1. This user must be an existing user in the destination system, not in the current system being used now. The default settings are used for the other tabs.
  2. Save the RFC Destination.

 

 

 

 

4. Configuration of Communication Port (Transactional Port to be mapped with RFC Destination)

 

A Transactional RFC communication port, as used for R/3 systems, is to be created. The port then needs to be mapped to the RFC Destination.

 

Configuration Steps:


  • T-Code is WE21:

 

Figure 7: Initial Screen for creating Port


 

  • Click on Transactional RFC and press the Create icon. Give the port a name and press Enter.

 

 

  • Fill in required fields and save the Port.

 

 

Figure 8: Technical Settings for Port

 

 

 

 

 

5. Configuration of Partner Profile (For Outbound messages)


The partner profile needs to be configured for the outbound message types of the material master, that is, MATMAS05 and MDMRECEIPT.

 

Configuration Steps:

 

  • T-Code is WE20: Select Partner Type LS and press Create icon.

 

 

Figure 9: Initial Screen for Creating Partner Profiles

 

  • Enter Details as below:

 

Figure 10: Outbound Parameters and other settings

 


  • To maintain the Outbound Parameters, press the highlighted icon; the below screen will appear:
  • Enter data as shown in the screenshot:

 

 

Figure 11: Screen for Outbound Parameter settings of Partner Profile

 

 

  • Similarly add MDMRECEIPT Message Type as well in the Partner Profile Outbound Parameters.

 

  • MDMRECEIPT is an old message type that was used in MDM 2.0 and MDM 3.0. As T-Code MDM_CLNT_EXTR still has to support the old MDM releases, we have to configure this IDoc type in ALE, although it is not used in standard MDM 5.5 and 7.1 scenarios. If we do not configure MDMRECEIPT, the extraction can fail with the following error message:

        “BI 003 – Could not determine recipients for message type MDMRECEIPT”.

 

 

 

 

 

6. Creation of ALE distribution Model/Customer Distribution Model


This step creates the communication channel from the sender system to the receiver system.


A customer distribution model needs to be created with a model name, and the MATMAS and MDMRECEIPT message types should be added there under the sender R/3 system and the receiver PI system.

 

Configuration Steps:

 

  • T-Code is BD64: Go to Change Mode and Click “Create Model View”

 

 

Figure 12: Initial Screen for Creating Distribution Model view

 


Figure 13: Pop-up screen for Creating Distribution Model

 

  • Select Model View and Click “Add Message Type”

Figure 14: Pop-up screen for Assigning Message Types to Model view

 

  • Similarly add MDMRECEIPT Message Type as well.
  • After all these steps, it will appear like this in Model View:

Figure 15: Complete Distribution Model

 

 

  • Generate Partner Profile for this Distribution Model:

Figure 16: Path for Generating Partner Profile for Distribution Model

 

  • The below screen appears; then click Execute:

Figure 17: Selection screen for generation of Partner Profile

 

  • Log will be displayed as shown below:

 

 

  • Save the model view and then distribute the model to the target system:

 

Figure 18: Path for distributing model to target system

  • Select Destination System
  • Log is displayed as follows:


7. Create Variant for Initial Extraction (Selection Criteria and Transfer Segments Are Defined)


The master extractor needs to be configured by setting the variants and using the LS name from the ALE settings made above.

The variant name reflects the material master data, and a short description is also given. The extraction object is chosen according to the message type of the master data to be exported. The logical system name of the receiver PI system is given as the target system. The mode is kept Initial for a first-time extraction. The block size is selected based on how many records are to be bundled together in one IDoc.


Customizing Steps:

 

 

  • T-Code is MDM_CLNT_EXTR:

 

  1. Enter the name of the variant and a description.
  2. Material_Extract is the extraction object for extracting material data.
  3. The target system is the receiver PI system.
  4. The Distribution Mode decides whether you extract initial load data or a delta extraction.
  5. The block size decides the number of records to be combined in a single MATMAS IDoc.
  6. After entering these values, press Enter.

 

Figure 20: Initial Screen for MDM_CLNT_EXTR
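The effect of the block size on IDoc bundling can be sketched with a small helper. This is a hypothetical illustration, not part of MDM_CLNT_EXTR itself: with N selected records and block size B, the extraction produces ceil(N/B) MATMAS IDocs.

```java
public class IdocBundling {
    // Number of IDocs produced when 'records' master data records are
    // bundled with the given block size (ceiling division).
    static int idocCount(int records, int blockSize) {
        return (records + blockSize - 1) / blockSize;
    }

    public static void main(String[] args) {
        // e.g. 2500 materials with block size 100 -> 25 IDocs
        System.out.println(idocCount(2500, 100)); // prints 25
        // 2501 materials need one extra IDoc for the remainder
        System.out.println(idocCount(2501, 100)); // prints 26
    }
}
```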


 

  • Field groups will appear. Here you can define the selection criteria as well as the transfer criteria:

 

Figure 21: Screen for entering Selection criteria and transfer selection

 

 

 

  • Double-click the selection segment E1MARAM and the below screen will appear:

 

 

  • Enter the selection criteria for the extraction program.

 

Note: If you enter a value range for a numeric type, you need to enter the value including leading zeroes for the concerned field. If this is not done, the extraction will start but a message will be generated saying “No Data Found”. For example, the material number MATNR must be maintained as 000000000070000000 (18 characters in total) and not as 70000000 (8 characters).
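The leading-zero requirement can be sketched with a small helper that pads a numeric material number to the full 18-character internal format. This is an illustration only; the helper name is hypothetical and not part of any SAP API.

```java
public class MatnrPadding {
    // Pads a numeric material number with leading zeroes to 18 characters,
    // mirroring the internal MATNR format the extraction program expects.
    static String padMatnr(String matnr) {
        StringBuilder sb = new StringBuilder(matnr);
        while (sb.length() < 18) {
            sb.insert(0, '0');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // 8-character input becomes the 18-character internal value
        System.out.println(padMatnr("70000000")); // prints 000000000070000000
    }
}
```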

 

  • Selecting the checkbox under the heading “Transfer” ensures that the given segment is included during extraction and IDoc creation.

 

  • Then save the variant and click “Start Extraction”.
  • A background job will be scheduled to extract the data.

 

         

 

  • Click on “Display Jobs” and all related jobs will be displayed on the screen:

 

Figure 22: Batch Jobs executed for MDM_CLNT_EXTR


 

  • Select the job and view the job log; it will appear as follows:

 

Figure 23: Log display for Batch Job

  • We can go to WE02 and see the IDocs posted after extraction.

 

Figure 24: IDoc display for extracted IDocs

 

Software Check:

  • We need to ensure that the Extraction Framework and Extraction Object software packages are installed in the system; without them, extraction will not happen.

 

 

  • With this, all the configuration settings needed for the extraction of the material master are complete. Further settings in the PI and MDM systems need to be made to automate the process completely.

 

 

 


Working with Tuples using MDM JAVA API


Introduction

This document describes how to work with tuples in MDM.

A tuple is a hierarchical structure in MDM. It cannot exist by itself; it always has a parent table or tuple. It can be a nested structure with a one-to-many relation at every level, e.g. a vendor can have multiple sets of contact information.

So contact information (address, phone number, etc.) can be grouped under a tuple that is part of the vendor table.
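The vendor-with-contacts example can be sketched as plain Java classes. This is an illustrative data model only; the class and field names are assumptions, not part of the MDM API, where the structure is instead defined in the repository schema.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the tuple structure: a vendor (main table record) owning a
// one-to-many set of contact tuples that cannot exist on their own.
class Contact {
    String address;
    String phoneNumber;

    Contact(String address, String phoneNumber) {
        this.address = address;
        this.phoneNumber = phoneNumber;
    }
}

class Vendor {
    String vendorId;
    // The "tuple": contacts exist only inside a vendor
    List<Contact> contacts = new ArrayList<>();

    Vendor(String vendorId) {
        this.vendorId = vendorId;
    }
}

public class TupleSketch {
    public static void main(String[] args) {
        Vendor vendor = new Vendor("VND-001");
        vendor.contacts.add(new Contact("1 Main St", "555-0100"));
        vendor.contacts.add(new Contact("2 High St", "555-0200"));
        System.out.println(vendor.contacts.size()); // prints 2
    }
}
```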

To modify or create tuple data, you first need to get the parent record in the main table that holds the tuple.

 

 

Creating new record with Tuple:

The following is sample code to create a new record in MDM with multiple tuple values.

TupleDefinitionId definitionId = reposSchema.getTupleDefinition(tupleName).getId();
TupleDefinitionSchema tupleDefinitionSchema = reposSchema.getTupleDefinitionSchema(definitionId);
FieldProperties[] tupleFieldProps = tupleDefinitionSchema.getFields();
CreateRecordsCommand createRecordsCommand = new CreateRecordsCommand(userCtx);
// Create an array of Record instances sized to the number of records to insert into the MDM table
Record[] records = new Record[collection.size()];
// Loop over the input collection and execute the following for each record
Record record = RecordFactory.createEmptyRecord(tableId);
records[i] = record;
MultiTupleValue multiTupleValue = new MultiTupleValue();
TupleValue tupleVal = MdmValueFactory.createTupleValue();
// Loop over the tuple value set, get a value for each field in the tuple,
// and set it into the TupleValue instance
for (int k = 0; k < tupleFieldProps.length; k++) {
    String tupleFieldName = tupleFieldProps[k].getCode();
    // The new values are stored in one object
    String tupleValue = obj.getFieldValue(tupleFieldName);
    if (tupleValue != null) {
        String fieldType = tupleFieldProps[k].getTypeName().toString();
        if (fieldType.equals("Text")) {
            tupleVal.setFieldValue(tupleFieldProps[k].getId(), new StringValue(tupleValue));
        } else if (fieldType.equals("Real")) {
            FloatValue fValue = new FloatValue(Float.parseFloat(tupleValue));
            tupleVal.setFieldValue(tupleFieldProps[k].getId(), fValue);
        } else if (fieldType.equals("Integer")) {
            if (!"".equals(tupleValue)) {
                IntegerValue intValue = new IntegerValue(Integer.parseInt(tupleValue));
                tupleVal.setFieldValue(tupleFieldProps[k].getId(), intValue);
            }
        } else if (fieldType.equals("Lookup [Flat]")) {
            tupleVal.setFieldValue(tupleFieldProps[k].getId(), new LookupValue(new RecordId(tupleValue)));
        } else if (fieldType.equals("Literal Date")) {
            try {
                SimpleDateFormat df = new SimpleDateFormat("MM/dd/yyyy");
                Calendar cal = Calendar.getInstance();
                cal.setTime(df.parse(tupleValue));
                tupleVal.setFieldValue(tupleFieldProps[k].getId(), new DateTimeValue(cal));
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
// Add the tuple value to the MultiTupleValue instance
multiTupleValue.addValue(tupleVal);
// End of loop over the tuple value set
// Set the MultiTupleValue instance on the tuple field of the record
record.setFieldValue(tableSchema.getFieldId(tupleName), multiTupleValue);
// End of loop over the main table collection
// Finally, set the records on the command and execute it
createRecordsCommand.setTableId(tableId);
createRecordsCommand.setRecords(records);
createRecordsCommand.execute();

 

Modifying Record with tuple:

To modify existing records, you first need to fetch the old records from the table. Then, for each record, you can get the TupleValue instance, which can be updated the same way as in the create method. Only the command type changes.

 

// Get the existing MultiTupleValue from the record
MultiTupleValue multiTupleValue = (MultiTupleValue) record.getFieldValue(tableSchema.getFieldId(tupleName));
// (For new records it was: MultiTupleValue multiTupleValue = new MultiTupleValue();)
// Get all tuple values
MdmValue[] tupleVals = multiTupleValue.getValues();
// To update a particular row in a tuple value set, get its instance from the array
TupleValue tupleVal = (TupleValue) tupleVals[ind];
// The rest of the code for setting values on the TupleValue is the same as above
// To add a new tuple value to the value set, create the TupleValue instance as below:
TupleValue newTupleVal = MdmValueFactory.createTupleValue();
// A new row in the tuple value set has to be added to the MultiTupleValue instance
multiTupleValue.addValue(newTupleVal);
// For modifying an existing tuple value, updating the TupleValue instance is enough
// Finally, set the records on the command and execute it
TableId tableId = reposSchema.getTableId(tablename);
modifyRecordsCommand.setRecords(updateRecs);
modifyRecordsCommand.setTableId(tableId);
modifyRecordsCommand.execute();

 

Deleting Tuple Value:

To delete tuple values, you also need to get the old record instance first.

MultiTupleValue multiTupleValue = (MultiTupleValue) record.getFieldValue(tableSchema.getFieldId(tupleName));
// Get the old tuple value instances
TupleValue[] tupleVals = multiTupleValue.getUntouchTupleRecords();
// Delete the old tuple values
int valsCount = tupleVals.length;
for (int j = valsCount - 1; j >= 0; j--) {
    multiTupleValue.removeTupleValue(tupleVals[j].getTupleRecordId());
}

 

 

Nested Tuple:

Working with a nested tuple is similar to the above. For the child tuple, you only need to get the tuple instance from its parent tuple.

MultiTupleValue childTupleValue = (MultiTupleValue) outerTupleVal.getFieldValue(tupleFieldProps[k].getId());
MdmValue[] childTupleVals = childTupleValue.getUntouchTupleRecords();
// Remove a tuple value
childTupleValue.removeTupleValue(((TupleValue) childTupleVals[j]).getTupleRecordId());
// Get the field properties of the child tuple to set field values for the child tuple
TupleDefinitionId definitionId2 = reposSchema.getTupleDefinition("ChildTupleName").getId();
TupleDefinitionSchema tupleDefinitionSchema2 = reposSchema.getTupleDefinitionSchema(definitionId2);
FieldProperties[] fieldProperties2 = tupleDefinitionSchema2.getFields();
// Add a new tuple value
TupleValue newValue = MdmValueFactory.createTupleValue();
// Set field values on the tuple instance, then add it to the MultiTupleValue
childTupleValue.addValue(newValue);
// For modifying an existing value, get the TupleValue instance and set the
// field values by looping over the field properties.