IBM C2090-719 – InfoSphere Warehouse V9.5

Test information:
Number of questions: 60
Time allowed in minutes: 90
Required passing score: 65%
Languages: English, Japanese

Related certifications:
IBM Certified Solution Designer – InfoSphere Warehouse V9.5

This certification exam certifies that the successful candidate has the knowledge, skills, and abilities necessary to perform the intermediate and advanced tasks required to design, develop, and support InfoSphere Warehouse V9.5 applications.

Section 1 – Architecting Warehouse Solutions (15%)
Demonstrate knowledge of InfoSphere Warehouse architecture and components
Editions
Software Components (why/when to use)
Describe the InfoSphere Warehouse building life-cycle
Steps to build and deploy the application(s)

Section 2 – Implementation (Table Ready) (5%)
Describe hardware topologies
Given a scenario, demonstrate how to implement security considerations

Section 3 – Physical Data Modeling (15%)
Given a scenario, demonstrate knowledge of the modeling process and the Design Studio features used
Identify physical design methods
Compare and synchronize
Impact analysis
Components
Enhancing the model
Given a scenario, describe range/data partitioning considerations (a DDL sketch follows this section)
When is it appropriate to use
Cost
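
The range-partitioning item above is easiest to see in DDL. The following is a minimal, hedged sketch of a range-partitioned fact table created through JDBC; the table, its columns, and the class name are hypothetical, and the exact PARTITION BY RANGE clauses should be verified against the DB2 9.5 SQL reference.

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

// Hedged sketch: creates a quarterly range-partitioned fact table.
// SALES_FACT and its columns are hypothetical; verify the partitioning
// clauses against the DB2 9.5 documentation before relying on them.
public class RangePartitionSketch {
    static void createSalesFact(Connection con) throws SQLException {
        String ddl =
            "CREATE TABLE SALES_FACT (" +
            "  SALE_DATE DATE NOT NULL, " +
            "  STORE_ID INTEGER NOT NULL, " +
            "  AMOUNT DECIMAL(12,2)) " +
            // One data partition per quarter; old quarters can later be
            // detached (rolled out) and new quarters attached (rolled in).
            "PARTITION BY RANGE (SALE_DATE) " +
            "(STARTING FROM ('2009-01-01') ENDING ('2009-12-31') EVERY 3 MONTHS)";
        try (Statement stmt = con.createStatement()) {
            stmt.execute(ddl);
        }
    }
}

Partition elimination on SALE_DATE and roll-in/roll-out of whole quarters are the typical benefits weighed against the maintenance cost noted in the objective.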

Section 4 – Cubing Services (CS) (20%)
Demonstrate knowledge of Cubing Services components
Cube server
Design Studio
MQT administration
Given a scenario, describe CS tooling and access methods
Demonstrate knowledge of CS optimization advisor
Identify the steps in creating a CS OLAP cube
Metadata
Creation of cube model and cube
Demonstrate knowledge of CS administration
Deploying cubes to cube server
Deploying cubes across multiple servers
Caching

Section 5 – Data Mining/Unstructured Text Analytics (12%)
Given a scenario, demonstrate knowledge of data mining and unstructured text analytics in InfoSphere Warehouse V9.5
Given a scenario, describe the InfoSphere Intelligent Miner methods and how to use them
The mining process
Modeling
Scoring
Visualization
Demonstrate how to use Design Studio to implement mining methods
Mining unstructured text data – what to do with the data after it is extracted
Describe the unstructured text analytic information extraction process
Using Java regular expressions (a minimal sketch follows this section)
Dictionary
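
As a concrete illustration of the Java regular expressions item above, here is a minimal sketch of rule-based extraction with java.util.regex; the pattern, class name, and sample text are made up for the example and are not part of the product's tooling.

import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch of regex-based information extraction from free text.
// The product-code pattern and input string are illustrative only.
public class RegexExtractionSketch {
    private static final Pattern PRODUCT_CODE = Pattern.compile("\\bWH-\\d{5}\\b");

    static List<String> extract(String text) {
        List<String> matches = new ArrayList<String>();
        Matcher m = PRODUCT_CODE.matcher(text);
        while (m.find()) {
            matches.add(m.group());   // keep each matched code
        }
        return matches;
    }

    public static void main(String[] args) {
        String note = "Customer reported problems with WH-10442 and WH-20987 after the upgrade.";
        // Extracted values like these would typically be written to relational
        // tables so that downstream warehouse flows can process them.
        System.out.println(extract(note));   // prints [WH-10442, WH-20987]
    }
}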

Section 6 – SQL Warehousing Tool (SQW) (20%)
Demonstrate knowledge of SQW components
Data flows
Control flows
Mining flows
Variables
Versioning
Describe SQW anatomy
Operators
Ports
Connectors
Given a scenario, describe the SQW debugging functions

Section 7 – Run-time Administration and Monitoring of the Warehouse (13%)
Identify the application preparation steps for deployment
Describe the InfoSphere Warehouse components managed by Admin console
Demonstrate knowledge of managing, monitoring, and scheduling processes in Admin console
Given a scenario, demonstrate knowledge of workload management and monitoring
Difference between workload and classes
Controlling types of queries
Performance Expert

IBM Certified Solution Designer – InfoSphere Warehouse V9.5

Job Role Description / Target Audience
This certification exam certifies that the successful candidate has the knowledge, skills, and abilities necessary to perform the intermediate and advanced tasks required to design, develop, and support InfoSphere Warehouse V9.5 applications. Applicable roles include Solution Architect, Data Warehouse Developer, and Database Administrator (in a data warehousing environment).

Requirements
This certification requires one test.

Test required:
See the test page below for test details, test objectives, suggested training, and sample tests.

Test C2090-719 – InfoSphere Warehouse V9.5

QUESTION 1
What are two reasons for a combination of database and front-end tool based analytic
architectures in a data warehouse implementation? (Choose two.)

A. Less data is moved across the network, making queries run faster.
B. The database can provide consistent analytic calculations and query speed for common queries.
C. The combination of architectures will ensure fast query performance.
D. Multidimensional queries cannot be processed in SQL by the database engine so it must be done using a front-end tool.
E. The front-end tool allows for additional and more complex algorithms specific to applications that use that tool.

Answer: B,E

Explanation:
Performing common calculations in the database gives consistent results and good performance for frequently run queries (B), while a front-end tool can add more complex, application-specific algorithms (E). The remaining options overstate what either layer guarantees: combining architectures does not by itself ensure fast queries, and multidimensional queries can in fact be expressed in SQL.

QUESTION 2
After deploying an application, you might need to update it by making changes to one or more
data flows. Deploying changes to an existing application is called delta deployment. How do you
package changes using delta deployment?

A. Package only the operator or property that has changed.
B. Package the data flow that has changed.
C. Package the control flow.
D. Package all the items that were originally packaged and use the same profile that was used.

Answer: C

Explanation:
In SQW, data flows execute inside control flows, and the control flow is the unit that is packaged for deployment; a changed data flow is therefore delivered by repackaging the control flow that contains it.

QUESTION 3
You are implementing a DB2 Workload Manager (WLM) schema to limit the number of load
utilities that can execute concurrently. Which WLM object would be used to accomplish this?

A. work class with an associated work action and an appropriate threshold
B. workload with an associated service class and an appropriate threshold
C. work class with an associated service class and an appropriate threshold
D. workload with an associated work action and an appropriate threshold

Answer: A

Explanation:
A work class with work type LOAD identifies load utilities, and a work action defined in a work action set applies a concurrency threshold to that work class. Workloads and service classes identify and group connections rather than specific activity types, so the other combinations do not directly limit concurrent load utilities. An illustrative sketch follows.
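The sketch below shows, under stated assumptions, how the correct combination from Question 3 might look as DDL submitted over JDBC: a work class that identifies LOAD utilities and a work action that applies a concurrency threshold to it. The object names, connection details, and exact threshold clauses are assumptions; confirm the syntax in the DB2 9.5 Workload Manager documentation.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Hedged sketch: limit concurrently executing LOAD utilities with a
// work class + work action + threshold. Requires the DB2 JDBC driver on
// the classpath; names, URL, and credentials are hypothetical.
public class WlmLoadLimitSketch {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:db2://dwhost:50000/DWHDB", "db2inst1", "password");
             Statement stmt = con.createStatement()) {

            // 1. Classify LOAD utilities into a work class.
            stmt.execute(
                "CREATE WORK CLASS SET ETL_WCS " +
                "(WORK CLASS LOAD_WC WORK TYPE LOAD)");

            // 2. Apply a threshold work action that caps concurrent LOADs at 2.
            stmt.execute(
                "CREATE WORK ACTION SET ETL_WAS FOR DATABASE " +
                "USING WORK CLASS SET ETL_WCS " +
                "(WORK ACTION LIMIT_LOADS ON WORK CLASS LOAD_WC " +
                "WHEN CONCURRENTDBCOORDACTIVITIES > 2 " +
                "COLLECT ACTIVITY DATA STOP EXECUTION)");
        }
    }
}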

QUESTION 4
Several operators are defined and linked together in DataFlow1. Another set of operators makes up
DataFlow2. A control flow is defined and both DataFlow1 and DataFlow2 are used. You require
that DataFlow1 dynamically change the variable values used in DataFlow2. How can you fulfill this
requirement?

A. The inherent design of the SQL Warehouse Tool is that any variable value changed in one data
flow is accessible by any other data flow as long as the data flows are defined in the same warehouse project.
B. Using the File Export operator, DataFlow1 writes a file that contains updated variable values.
DataFlow2 accesses those updated variable values by reading that same file using an Import File operator.
C. When a control flow is executed, a run profile provides the initial values for all variables. Once
those values are set in the run profile, they are in effect for the entire execution of the control flow.
D. Using the File Export operator, DataFlow1 writes a file containing updated variable values. A
variable assignment operator is then used to assign the values in the file to the appropriate
variables. DataFlow2 then has access to the updated variable values.

Answer: D

Explanation:
Variable values are not shared automatically between data flows. Exporting the updated values to a file and then using a variable assignment operator to load them, as described in option D, makes the new values available to DataFlow2; a run profile (option C) only supplies initial values.

QUESTION 5
A relational database and a database model that is often a star or snowflake schema are
characteristics of which engine storage structure?

A. MOLAP
B. ROLAP
C. Multidimensional cubing
D. Proprietary

Answer: B

Explanation:
ROLAP (relational OLAP) keeps the data in a relational database, typically modeled as a star or snowflake schema; MOLAP and multidimensional cubing engines rely on multidimensional storage structures instead. A hedged query sketch follows.

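To make the ROLAP characteristics in Question 5 concrete, the sketch below runs the kind of star-join query a ROLAP layer issues against a relational warehouse. The fact and dimension tables, columns, and connection details are hypothetical.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hedged sketch of a star-schema query: one fact table joined to two
// dimension tables, aggregated by dimension attributes. All names are
// hypothetical; requires the DB2 JDBC driver on the classpath.
public class StarQuerySketch {
    public static void main(String[] args) throws Exception {
        String starJoin =
            "SELECT d.CAL_YEAR, s.REGION, SUM(f.AMOUNT) AS TOTAL_SALES " +
            "FROM SALES_FACT f " +
            "JOIN DATE_DIM d ON f.DATE_KEY = d.DATE_KEY " +
            "JOIN STORE_DIM s ON f.STORE_KEY = s.STORE_KEY " +
            "GROUP BY d.CAL_YEAR, s.REGION";
        try (Connection con = DriverManager.getConnection(
                "jdbc:db2://dwhost:50000/DWHDB", "db2inst1", "password");
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery(starJoin)) {
            while (rs.next()) {
                System.out.printf("%d %s %s%n",
                    rs.getInt("CAL_YEAR"),
                    rs.getString("REGION"),
                    rs.getBigDecimal("TOTAL_SALES"));
            }
        }
    }
}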