Informatica MDM Interview Questions and Answers

Q1 : What does the term MDM mean?
A : MDM stands for Master Data Management. It is a comprehensive method of linking all of an enterprise's critical data to a single file, known as a master file, which provides a common point of reference. When done properly, MDM streamlines the process of data sharing among departments and personnel.

Q2 : Differentiate between variable and mapping parameter.
A : >> A Mapping variable is dynamic, i.e. its value can change at any point during the session. PowerCenter reads the variable's initial value before the session starts, uses variable functions to change the value during the session, and saves the current value before the session ends. The next time the session runs, the variable starts from the value saved in the previous run.
>> A Mapping parameter is a static value that you define before the session starts, and it remains the same until the session ends. When the session runs, PowerCenter evaluates the parameter's value and retains it for the entire session. The next time the session runs, it reads the value from the parameter file again.

Q3 : Define Informatica PowerCenter.
A : Designed by Informatica Corporation, PowerCenter is data integration software that provides an environment for loading data into a centralized location such as a data warehouse. Data can be extracted from an array of sources, transformed according to the business logic, and then loaded into flat files as well as relational targets.

Q4 : What are the ways of deleting duplicate records in Informatica?
A : There are several ways of deleting duplicate records in Informatica. They are as follows:

1. Using SELECT DISTINCT in the Source Qualifier
2. Using an Aggregator transformation and grouping by all fields
3. Overriding the SQL query in the Source Qualifier
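The SQL-override approach can be illustrated outside Informatica. This is a minimal sketch using Python's built-in sqlite3 module and a hypothetical customers table; it shows the kind of DISTINCT query you might place in the Source Qualifier's SQL override.

```python
import sqlite3

# In-memory database standing in for the source system (hypothetical table/columns).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (cust_id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "Alice"), (2, "Bob"), (1, "Alice"), (2, "Bob"), (3, "Carol")],
)

# The same SELECT DISTINCT you would put in the SQL override: exact duplicate
# rows collapse to a single row before they ever reach the mapping.
rows = conn.execute(
    "SELECT DISTINCT cust_id, name FROM customers ORDER BY cust_id"
).fetchall()
print(rows)  # → [(1, 'Alice'), (2, 'Bob'), (3, 'Carol')]
```

The five source rows contain two exact duplicates, so only three distinct rows survive.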

Q5 : Explain OLTP.
A : OLTP stands for Online Transaction Processing. An OLTP system modifies data as each transaction arrives and supports a huge number of concurrent users.

Q6 : Describe the parallel degree of data loading properties in MDM.
A : This property specifies the degree of parallelism set on the base object table and its related tables. Although it does not apply to all batch processes, it can have a positive effect on performance when it is used. However, its use is limited by the number of CPUs on the database server machine and the amount of available memory. The default value is 1.

Q7 : How does automatic lock expiration work in Informatica MDM?
A : The Hub Console refreshes the lock on the current connection every 60 seconds. A user can release a lock manually. If a user holding a lock switches to another database, the lock is released automatically. If the user terminates the Hub Console, the lock expires after one minute.

Q8 : Explain various types of LOCK used in Informatica MDM 10.1.
A : Two types of LOCK are used in Informatica MDM 10.1. They are:
1. Exclusive Lock: Allows only one user to make changes to the underlying Operational Reference Store.
2. Write Lock: Allows multiple users to make changes to the underlying metadata at the same time.

Q9 : Name the tables that are linked with staging data in Informatica MDM.
A : There are various tables that are linked with staging data in Informatica MDM. They are:

1. Landing Table
2. Raw Table
3. Rejects Table
4. Staging Table

Q10 : Name various tools that require a LOCK in Informatica MDM.
A : Several tools require a LOCK to make configuration changes to the MDM Hub Master Database. They are:

1. Message Queues
2. Tool Access
3. Users
4. Security Providers
5. Databases
6. Repository Manager

Q11 : Name the tool which does not require Lock in Informatica MDM.
A : The Merge Manager, Data Manager, and Hierarchy Manager do not require write locks. The Audit Manager also does not need a write lock.

Q12 : What is the way to find all the invalid mappings in a folder?
A : All the invalid mappings in a folder can be found by running the following query against the repository:
SELECT MAPPING_NAME FROM REP_ALL_MAPPINGS WHERE
SUBJECT_AREA = 'YOUR_FOLDER_NAME' AND PARENT_MAPPING_IS_VALID <> 1
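To illustrate how this query behaves, here is a runnable sketch that mocks the REP_ALL_MAPPINGS repository view in SQLite with a hypothetical folder name and mapping names; in practice you would run the query against the actual PowerCenter repository database.

```python
import sqlite3

# Mock of the repository view REP_ALL_MAPPINGS (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE REP_ALL_MAPPINGS "
    "(MAPPING_NAME TEXT, SUBJECT_AREA TEXT, PARENT_MAPPING_IS_VALID INTEGER)"
)
conn.executemany(
    "INSERT INTO REP_ALL_MAPPINGS VALUES (?, ?, ?)",
    [
        ("m_load_customers", "SALES", 1),    # valid mapping
        ("m_load_orders", "SALES", 0),       # invalid mapping
        ("m_load_products", "INVENTORY", 0), # invalid, but in another folder
    ],
)

# SUBJECT_AREA filters to one folder; PARENT_MAPPING_IS_VALID <> 1 keeps
# only the mappings flagged as invalid.
invalid = conn.execute(
    "SELECT MAPPING_NAME FROM REP_ALL_MAPPINGS "
    "WHERE SUBJECT_AREA = 'SALES' AND PARENT_MAPPING_IS_VALID <> 1"
).fetchall()
print(invalid)  # → [('m_load_orders',)]
```

Only the invalid mapping inside the named folder is returned; invalid mappings in other folders are excluded by the SUBJECT_AREA filter.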

Q13 : Name various data movement modes in Informatica.
A : The data movement mode determines how the PowerCenter server handles character data. It is selected in the Informatica server configuration settings. There are two data movement modes available in Informatica. They are:

1. Unicode Mode
2. ASCII Mode

Q : Explain OLAP.
A : OLAP stands for Online Analytical Processing. An OLAP application gathers, manages, processes, and presents multidimensional data for management and analysis purposes.

Q14 : Define Data Mining.
A : Data mining is the process of analyzing data from several perspectives and summarizing it into useful information.

Q15 : Describe various repositories that can be generated using Informatica Repository Manager.
A : There are various repositories that can be formed with the help of Informatica Repository Manager. They are as follows:
1. Standalone Repository: A repository that functions individually and is not related to any other repository.
2. Local Repository: A repository that functions within a domain. It can connect to a global repository using global shortcuts and can use objects in the global repository's shared folders.
3. Global Repository: A repository that acts as the centralized repository in a domain. It contains objects shared across the repositories in the domain.

Q16 : Name various objects that can’t be used in mapplet.
A : There are a number of objects that you cannot use in a mapplet. They are:

1. Joiner transformations
2. COBOL source definitions
3. Target definitions
4. IBM MQ source definitions
5. XML source definitions
6. Normalizer transformations
7. Non-reusable Sequence Generator transformations
8. PowerMart 3.5-style LOOKUP functions
9. Pre- or post-session stored procedures

Q17 : Define Dimension Table. 
A : A dimension table is a collection of categories, hierarchies, and logic that can be used to traverse hierarchy nodes. It contains the textual attributes of the measurements stored in fact tables.

Q18 : Describe different methods to load dimension tables.
A : There are two methods for loading data into dimension tables. They are as follows:
>> Direct or fast: In this method, all keys and constraints are disabled before the data is loaded. Once all the data is loaded, it is validated against the keys and constraints. Any data found to be invalid is not included in the index, and all future processing on that data is skipped.
>> Conventional or slow: In this method, the data is validated against all keys and constraints as it is loaded. This way, data integrity is maintained.

Q19 : Explain Mapplet.
A : A mapplet is a reusable object containing a set of transformations; it allows you to reuse that transformation logic in a wide range of mappings.

Q20 : Define Data Warehousing.
A : A data warehouse is the main repository of an organization's historical data and its corporate memory, containing the raw material for management's decision support system. Data warehousing is used because it lets a data analyst run complex queries and analysis, such as data mining, on the information without slowing down the operational systems. The data collected in a data warehouse is organized to support management decision making, presenting a coherent image of business conditions at a single point in time. In short, a data warehouse is a repository of information available for analysis and query.

Q21 : Describe various fundamental stages of Data Warehousing. 
A : There are various fundamental stages of Data warehousing. They are:

1. Offline Operational Databases: In this first stage, data warehouses are developed simply by copying the operational system's database to an offline server, so that the processing load of reporting does not impact the performance of the operational system.
2. Offline Data Warehouse: In this stage, data warehouses are updated regularly from the operational systems, and all the data is stored in an integrated, reporting-oriented data structure.
3. Real Time Data Warehouse: In this stage, data warehouses are updated on an event or transaction basis, every time the operational system performs a transaction.
4. Integrated Data Warehouse: In this final stage, the data warehouse generates transactions or activity that is passed back into the operational system for use in the organization's daily activity.

Q22 : Define Dimensional Modeling.
A : Dimensional modeling differs from the third normal form and involves two types of tables: a fact table containing the measurements of the business, and dimension tables containing the context of those measurements.
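The fact/dimension split can be sketched as a minimal star schema. This example uses Python's sqlite3 with hypothetical table and column names: the fact table holds the measurement (sale_amount) plus a foreign key, and the dimension table holds the context (product attributes).

```python
import sqlite3

# Minimal star schema sketch (hypothetical names): one fact table holding
# measurements, one dimension table holding the measurement context.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,  -- surrogate key
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),  -- FK to dimension
    sale_amount REAL                                          -- the measurement
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (1, 10.0), (1, 15.0), (2, 7.5);
""")

# A typical dimensional query: aggregate the fact, slice by dimension attributes.
report = conn.execute("""
    SELECT d.product_name, SUM(f.sale_amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.product_name
    ORDER BY d.product_name
""").fetchall()
print(report)  # → [('Gadget', 7.5), ('Widget', 25.0)]
```

The measurements live only in the fact table; the dimension table supplies the descriptive attributes that queries group and filter by.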

Q23 : Name various components of Informatica PowerCenter.
A : There are various components of Informatica PowerCenter. They are as follows:

1. PowerCenter Repository
2. PowerCenter Domain
3. PowerCenter Client
4. Administration Console
5. Integration Service
6. Repository Service
7. Data Analyzer
8. Web Services Hub
9. PowerCenter Repository Reports
10. Metadata Manager

Q24 : Explain Transformation.
A : A transformation is a repository object that generates, modifies, or passes data. In a mapping, transformations represent the operations the Integration Service performs on the data. Data passes through transformation ports that are linked within a mapping or mapplet.

Q25 : Define Fact Table.
A : A fact table contains the measurements of business processes along with foreign keys to the dimension tables.