C2150-500 IBM Security Dynamic and Static Applications V2 Fundamentals

Test information:
Number of questions: 57
Time allowed in minutes: 120
Required passing score: 58%
Languages: English, French, Latin American Spanish, Portuguese (Brazil)

Related certifications:
IBM Certified Solution Advisor – Security Dynamic and Static Applications V2

Section 1 – Application Security (20%)
Given a scenario, differentiate between DAST, SAST, and/or IAST.
Identify key or necessary triage tasks for DAST and SAST.
Given a scenario, demonstrate various reporting tasks.
Given a scenario, explain continuous delivery tasks, e.g., defect tracking and integrating with the SDLC.
Identify AppScan Source remediation tasks.
Given a scenario, identify common web application vulnerabilities.
Identify the types of external references that the AppScan tools provide.

Section 2 – Competitive Analysis (7%)
Identify the competitive position of AppScan from the perspective of the Gartner Magic Quadrant.
Identify the strengths of the AppScan offering.
Identify the benefits of using AppScan tools, rather than their alternatives.

Section 3 – IBM Security Portfolio (10%)
Given a scenario, identify how AppScan fits into the IBM security framework.
Given a scenario, identify how AppScan fits into the IBM mobile security framework.

Section 4 – Software Development Lifecycle (17%)
Identify ways to integrate AppScan into a build process.
Given a scenario, demonstrate ways to integrate AppScan into a build process.
Identify where blackbox and whitebox solutions fit into secure SDLC.
Given a scenario, explain common development platforms (e.g., Java, .NET, C/C++).
Given a scenario, demonstrate the extensibility of AppScan tools.
Identify the extensibility of AppScan tools.

Section 5 – AppScan Product Knowledge (21%)
Given a scenario, explain how components of the AppScan suite are used in different deployments.
Given a scenario, determine if AppScan can provide a solution.
Identify potential deployment architectures.
Identify supported AppScan development frameworks.
Identify the advantages, purposes, and offerings of integrating AppScan with security tools.

Section 6 – Mobile Security (11%)
Identify the common types of mobile vulnerabilities.
Identify the mobile support platform for AppScan Source and integration with IBM Worklight.

Section 7 – Business Drivers (6%)
Given a scenario, demonstrate how AppScan can solve common problems.
Given a scenario, explain how AppScan can impact a company’s budget.
Given a scenario, explain Application security compliance drivers.

Section 8 – Licensing (8%)
Identify the required license structure for each component in AppScan.
Given a scenario, identify the licenses required for a specific deployment.

IBM Certified Solution Advisor – Security Dynamic and Static Applications V2

Job Role Description / Target Audience
This entry-level certification is for solution advisors who are able to identify opportunities and influence direction across the AppScan portfolio. They recommend education, influence key decision makers, are able to respond to RFPs and RFQs, and understand licensing and pricing.

These solution advisors understand application security and competitive analysis, have knowledge of the broader IBM Security portfolio and the software development lifecycle, have AppScan product knowledge, and understand mobile security, business drivers, and licensing.

This is a technical sales role (CTP/pre-sales engineer) certification.
To attain the IBM Certified Solution Advisor – Security Dynamic and Static Applications V2 certification, candidates must pass 1 test. To gain additional knowledge and skills, and to prepare for this test based on the job role and test objectives, follow the link to the test below and refer to the Test Preparation tab.

Recommended Prerequisite Skills
Have static analysis skills:
Read and program code
Configure source code to compile (build) an application
Remediate trivial errors in Java and .NET apps (low-hanging fruit)
Have dynamic analysis skills:
Understand the web application architecture
Produce high-level deployment architecture solutions.
Write technically.
Comfortable discussing technical concepts with developers.
Comfortable discussing business and financial concepts with managers and executives.

Requirements
This certification requires 1 test(s).

Test(s) required:
Test C2150-500 – IBM Security Dynamic and Static Applications V2 Fundamentals

The test:
contains questions requiring single and multiple answers. For multiple-answer questions, you need to choose all required options to get the answer correct. You will be advised how many options make up the correct answer.
is designed to provide diagnostic feedback on the Examination Score Report, correlating back to the test objectives, informing the test taker how he or she did on each section of the test. As a result, to maintain the integrity of each test, questions and answers are not distributed.


300-170 DCVAI Implementing Cisco Data Center Virtualization and Automation

Exam Number 300-170 DCVAI
Associated Certifications CCNP Data Center
Duration 90 minutes (60-70 questions)
Available Languages English

This exam tests a candidate’s knowledge of implementing data center infrastructure, including virtualization, automation, Cisco Application Centric Infrastructure (ACI), ACI network resources, and ACI management and monitoring.

Exam Description
The Implementing Cisco Data Center Virtualization and Automation (DCVAI) exam (300-170) is a 90-minute, 60–70 question assessment. This exam is one of the exams associated with the CCNP Data Center Certification. This exam tests a candidate’s knowledge of implementing Cisco data center infrastructure including virtualization, automation, Application Centric Infrastructure, Application Centric Infrastructure network resources, and Application Centric Infrastructure management and monitoring. The course, Implementing Cisco Data Center Virtualization and Automation v6 (DCVAI), helps candidates to prepare for this exam because the content is aligned with the exam topics.

The following topics are general guidelines for the content likely to be included on the exam. However, other related topics may also appear on any specific delivery of the exam. In order to better reflect the contents of the exam and for clarity purposes, the guidelines below may change at any time without notice.

1.0 Implement Infrastructure Virtualization 19%

1.1 Implement logical device separation

1.1.a VDC
1.1.b VRF

1.2 Implement virtual switching technologies

2.0 Implement Infrastructure Automation 16%

2.1 Implement configuration profiles

2.1.a Auto-config
2.1.b Port profiles
2.1.c Configuration synchronization

2.2 Implement POAP

2.3 Compare and contrast different scripting tools

2.3.a EEM
2.3.b Scheduler
2.3.c SDK

3.0 Implementing Application Centric Infrastructure 27%

3.1 Configure fabric discovery parameters

3.2 Implement access policies

3.2.a Policy groups
3.2.b Protocol policies
3.2.b [i] LLDP, CDP, LACP, and link-level
3.2.c AEP
3.2.d Domains
3.2.e Pools
3.2.f Profiles
3.2.f [i] Switch
3.2.f [ii] Interface

3.3 Implement VMM domain integrations

3.4 Implement tenant-based policies

3.4.a EPGs
3.4.a [i] Pathing
3.4.a [ii] Domains
3.4.b Contracts
3.4.b [i] Consumer
3.4.b [ii] Providers
3.4.b [iii] vzAny (TCAM conservation)
3.4.b [iv] Inter-tenant
3.4.c Private networks
3.4.c [i] Enforced/unenforced
3.4.d Bridge domains
3.4.d [i] Unknown unicast settings
3.4.d [ii] ARP settings
3.4.d [iii] Unicast routing

4.0 Implementing Application Centric Infrastructure Network Resources 25%

4.1 Implement external network integration

4.1.a External bridge network
4.1.b External routed network

4.2 Implement packet flow

4.2.a Unicast
4.2.b Multicast
4.2.c Broadcast
4.2.d Endpoint database

4.3 Describe service insertion and redirection

4.3.a Device packages
4.3.b Service graphs
4.3.c Function profiles

5.0 Implementing Application Centric Infrastructure Management and Monitoring 13%

5.1 Implement management

5.1.a In-band management
5.1.b Out-of-band management

5.2 Implement monitoring

5.2.a SNMP
5.2.b Atomic counters
5.2.c Health score evaluations

5.3 Implement security domains and role mapping

5.3.a AAA
5.3.b RBAC

5.4 Compare and contrast different scripting tools

5.4.a SDK
5.4.b API Inspector / XML

QUESTION 1
You have a Cisco Nexus 1000V Series Switch. When must you use the system VLAN?

A. to use VMware vMotion
B. to perform an ESXi iSCSI boot
C. to perform a VM iSCSI boot
D. to perform an ESXi NFS boot

Answer: A


QUESTION 2
Which option must be defined to apply a configuration across a potentially large number of switches in the most scalable way?

A. a configuration policy
B. a group policy
C. an interface policy
D. a switch profile

Answer: C


QUESTION 3
Which two options are benefits of using the configuration synchronization feature? (Choose two.)

A. Supports the feature command
B. Supports existing session and port profile functionality
C. can be used by any Cisco Nexus switch
D. merges configurations when connectivity is established between peers
E. supports FCoE in vPC topologies

Answer: A,C


300-175 DCUCI Implementing Cisco Data Center Unified Computing

Exam Number 300-175 DCUCI
Associated Certifications CCNP Data Center
Duration 90 minutes (60-70 questions)
Available Languages English
Register Pearson VUE

This exam tests a candidate’s knowledge of implementing data center technologies including unified computing, unified computing maintenance and operations, automation, unified computing security, and unified computing storage.

Exam Description
The Implementing Cisco Data Center Unified Computing (DCUCI) exam (300-175) is a 90-minute, 60–70 question assessment. This exam is one of the exams associated with the CCNP Data Center Certification. This exam tests a candidate’s knowledge of implementing Cisco data center technologies including unified computing, unified computing maintenance and operations, automation, unified computing security, and unified computing storage. The course, Implementing Cisco Data Center Unified Computing v6 (DCUCI), helps candidates to prepare for this exam because the content is aligned with the exam topics.

The following topics are general guidelines for the content likely to be included on the exam. However, other related topics may also appear on any specific delivery of the exam. In order to better reflect the contents of the exam and for clarity purposes, the guidelines below may change at any time without notice.

1.0 Implement Cisco Unified Computing 28%

1.1 Install Cisco Unified Computing platforms
1.1.a Stand-alone computing
1.1.b Chassis / blade
1.1.c Modular / server cartridges
1.1.d Server integration

1.2 Implement server abstraction technologies
1.2.a Service profiles
1.2.a [i] Pools
1.2.a [ii] Policies
1.2.a [ii].1 Connectivity
1.2.a [ii].2 Placement policy
1.2.a [ii].3 Remote boot policies
1.2.a [iii] Templates
1.2.a [iii].1 Policy hierarchy
1.2.a [iii].2 Initial vs updating

2.0 Unified Computing Maintenance and Operations 20%

2.1 Implement firmware upgrades, packages, and interoperability

2.2 Implement backup operations

2.3 Implement monitoring

2.3.a Logging
2.3.b SNMP
2.3.c Call Home
2.3.d NetFlow
2.3.e Monitoring session

3.0 Automation 12%

3.1 Implement integration of centralized management

3.2 Compare and contrast different scripting tools

3.2.a SDK
3.2.b XML

4.0 Unified Computing Security 13%

4.1 Implement AAA and RBAC

4.2 Implement key management

5.0 Unified Computing Storage 27%

5.1 Implement iSCSI

5.1.a Multipath
5.1.b Addressing schemes

5.2 Implement Fibre Channel port channels

5.3 Implement Fibre Channel protocol services

5.3.a Zoning
5.3.b Device alias
5.3.c VSAN

5.4 Implement FCoE

5.4.a FIP
5.4.b FCoE topologies
5.4.c DCB

5.5 Implement boot from SAN

5.5.a FCoE / Fibre Channel
5.5.b iSCSI

QUESTION 3 – (Topic 1)
Which two statements are true concerning authorization when using RBAC in a Cisco Unified Computing System? (Choose two.)

A. A locale without any organizations, allows unrestricted access to system resources in all organizations.
B. When a user has both local and remote accounts, the roles defined in the remote user account override those in the local user account.
C. A role contains a set of privileges which define the operations that a user is allowed to take.
D. Customized roles can be configured on and downloaded from remote AAA servers.
E. The logical resources, pools and policies, are grouped into roles.

Answer: C,E

QUESTION 4 – (Topic 1)
Which actions must be taken in order to connect a NetApp FCoE storage system to a Cisco UCS system?

A. Ensure that the Fibre Channel switching mode is set to Switching, and use the Fibre Channel ports on the Fabric Interconnects.
B. Ensure that the Fibre Channel switching mode is set to Switching, and reconfigure the port to a FCoE Storage port.
C. Ensure that the Fibre Channel switching mode is set to End-Host, and use the Ethernet ports on the Fabric interconnects.
D. Ensure that the Fibre Channel switching mode is set to Switching, and use the Ethernet ports on the Fabric Interconnects.

Answer: A

QUESTION 5 – (Topic 1)
Which two protocols are accepted by the Cisco UCS Manager XML API? (Choose two.)

A. SMASH
B. HTTPS
C. HTTP
D. XMTP
E. SNMP

Answer: A,E

QUESTION 6 – (Topic 1)
A Cisco UCS administrator is planning to complete a firmware upgrade by using Auto Install. Which two options are prerequisites to run Auto Install? (Choose two.)

A. minor fault fixing
B. configuration backup
C. service profiles unmounted from the blade servers
D. time synchronization
E. fault suppression started on the blade servers

Answer: A,B

QUESTION 7 – (Topic 1)
Which two prerequisites are required to configure a SAN boot from the FCoE storage of a Cisco UCS system? (Choose two.)

A. The Cisco UCS domain must be able to communicate with the SAN storage device that hosts the operating system image.
B. A boot policy must be created that contains a local disk, and the LVM must be configured correctly.
C. There must be iVR-enabled FCoE proxying between the Cisco UCS domain and the SAN storage device that hosts the operating system image.
D. There must be a boot target LUN on the device where the operating system image is located.
E. There must be a boot target RAID on the device where the operating system image is located.

Answer: C,D


C2150-210 IBM Security Identity Governance Fundamentals V5.1

Test information:
Number of questions: 47
Time allowed in minutes: 90
Required passing score: 58%
Languages: English, French, Latin American Spanish, Portuguese (Brazil)

Related certifications:
IBM Certified Associate – Security Identity Governance V5.1

Certifications (13%)
Define certification dataset and campaign
Define signoff options
Define supervisor and reviewer activities
Define notification configuration

Role Management (9%)
Define role structure
Publish role and define visibility
Consolidate role

Role Mining (15%)
Load Access Optimizer data
Create Role Mining session
Analyse statistics charts to identify candidate role
Analyse assignment map to identify candidate role
Analyse entitlement and user coverage to identify candidate role
Leverage candidate role in IAG warehouse

Role Maintenance and Health (6%)
Identify unused roles
Retire role
Set up Role Certification campaign

Reporting (13%)
Identify standard report
Customize report layout
Configure scope visibility
Customize query and add filter criteria
Configure authorization to report for selected users

Separation of Duties (17%)
Define Business Activities
Define SoD Policy
Define Technical Transformation
Analyse Risk Violations
Define Mitigation Controls
Set up Risk Violation Certification Campaign

Installation (9%)
Prepare database server and schema
Configure virtual machine
Install virtual appliance
Configure database connections

Enterprise Integration (4%)
Identify ISIM and ISIG integration options
Identify supported connectors

ISIG Authorization Model (9%)
Define functional authorization for ISIG users
Restrict the data portion for a functional authorization
Define and use Attribute Groups

Access Request Management (9%)
Identify common process activities
Identify UI customization options
Review access request status

IBM Certified Associate – Security Identity Governance V5.1

Job Role Description / Target Audience
An IBM Certified Associate – Security Identity Governance V5.1 is an individual with entry-level knowledge of and experience with IBM Security Identity Governance V5.1. This individual is knowledgeable about the fundamental concepts of IBM Security Identity Governance V5.1 through hands-on experience. The associate should have an in-depth knowledge of the basic to intermediate tasks required in day-to-day use of IBM Security Identity Governance V5.1, and should be able to complete these tasks with little to no assistance from documentation, peers, or support.

Key Areas of Competency
IBM Security Identity Governance UI from an admin and end user perspective
Identify the key ISIG features
Understand the benefits of using ISIG for identity and access governance.

Recommended Prerequisite Skills
Working end user knowledge of IBM Security Identity Governance V5.1
Understand Identity Governance, Risk and Compliance (GRC) infrastructure such as audit, reporting, access review, and certification.
Experience with role modeling and role mining
Experience with role health and maintenance.
Understand the ISIG entitlement model and how to leverage it to build target application authorization models.
Understand the ISIG authorization model and access governance responsibilities.
Experience responding to an RFP in the access governance space.
Understand business activity-based separation of duties modeling for better business and auditor readability.
Understand typical functionality of access request workflows such as manager approvals.

Requirements
This certification requires 1 test(s).


70-773 Analyzing Big Data with Microsoft R

Exam 70-773
Analyzing Big Data with Microsoft R

Published: January 3, 2017
Languages: English
Audiences: Data scientists
Technology: Microsoft R Server, SQL R Services
Credit toward certification: MCP, MCSE

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.

If you have other questions or feedback about Microsoft Certification exams or about the certification program, registration, or promotions, please contact your Regional Service Center.

Read and explore big data
Read data with R Server
Read supported data file formats, such as text files, SAS, and SPSS; convert data to XDF format; identify trade-offs between XDF and flat text files; read data through Open Database Connectivity (ODBC) data sources; read in files from other file systems; use an internal data frame as a data source; process data from sources that cannot be read natively by R Server
Summarize data
Compute crosstabs and univariate statistics, choose when to use rxCrossTabs versus rxCube, integrate with open source technologies by using packages such as dplyrXdf, use group by functionality, create complex formulas to perform multiple tasks in one pass through the data, extract quantiles by using rxQuantile
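The rxCrossTabs-versus-rxCube choice mentioned above comes down to output shape. A minimal sketch, assuming a Microsoft R Server / R Client installation with the RevoScaleR package and using R's built-in `infert` data frame:

```r
library(RevoScaleR)  # ships with Microsoft R Server / Microsoft R Client

# Both functions compute the same aggregation; they differ in layout:
#  - rxCrossTabs returns a wide contingency table
#  - rxCube returns long output (one row per cell), convenient for
#    further processing or plotting
# F() coerces the numeric 'case' column to a factor on the fly.
xtab <- rxCrossTabs(~ education:F(case), data = infert)
cube <- rxCube(~ education:F(case), data = infert)

print(xtab)  # counts in contingency-table form
print(cube)  # the same counts, one row per education/case combination
```

The long form produced by rxCube can be turned into an ordinary data frame with rxResultsDF() when an open source package needs to consume it.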
Visualize data
Visualize in-memory data with base plotting functions and ggplot2; create custom visualizations with rxSummary and rxCube; visualize data with rxHistogram and rxLinePlot, including faceted plots

Process big data
Process data with rxDataStep
Subset rows of data, modify and create columns by using the Transforms argument, choose when to use on-the-fly transformations versus in-data transform trade-offs, handle missing values through filtering or replacement, generate a data frame or an XDF file, process dates (POSIXct, POSIXlt)
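The rxDataStep tasks listed above (row subsetting, column creation via the Transforms argument, XDF output) can be combined in a single pass. A hedged sketch, assuming RevoScaleR is installed; the output file name is illustrative:

```r
library(RevoScaleR)

# One pass through the data: filter rows, derive a new column,
# and write the result to an XDF file.
rxDataStep(inData = iris,
           outFile = "iris_setosa.xdf",   # hypothetical output path
           rowSelection = Species == "setosa",
           transforms = list(Ratio = Sepal.Length / Sepal.Width),
           overwrite = TRUE)

# Inspect the result without loading it all into memory.
rxGetInfo("iris_setosa.xdf", getVarInfo = TRUE)
```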
Perform complex transforms that use transform functions
Define a transform function; reshape data by using a transform function; use open source packages, such as lubridate; pass in values by using transformVars and transformEnvir; use internal .rx variables and functions for tasks, including cross-chunk communication
Manage data sets
Sort data in various orders, such as ascending and descending; use rxSort deduplication to remove duplicate values; merge data sources using rxMerge(); merge options and types; identify when alternatives to rxSort and rxMerge should be used
Process text using RML packages
Create features using RML functions, such as featurizeText(); create indicator variables and arrays using RML functions, such as categorical() and categoricalHash(); perform feature selection using RML functions

Build predictive models with ScaleR
Estimate linear models
Use rxLinMod, rxGlm, and rxLogit to estimate linear models; set the family for a generalized linear model by using functions such as rxTweedie; process data on the fly by using the appropriate arguments and functions, such as the F function and Transforms argument; weight observations through frequency or probability weights; choose between different types of automatic variable selections, such as greedy searches, repeated scoring, and byproduct of training; identify the impact of missing values during automatic variable selection
Build and use partitioning models
Use rxDTree, rxDForest, and rxBTrees to build partitioning models; adjust the weighting of false positives and misses by using loss; select parameters that affect bias and variance, such as pruning, learning rate, and tree depth; use as.rpart to interact with open source ecosystems
Generate predictions and residuals
Use rxPredict to generate predictions; perform parallel scoring using rxExec; generate different types of predictions, such as link and response scores for GLM, response, prob, and vote for rxDForest; generate different types of residuals, such as Usual, Pearson, and DBM
Evaluate models and tuning parameters
Summarize estimated models; run arbitrary code out of process, such as parallel parameter tuning by using rxExec; evaluate tree models by using RevoTreeView and rxVarImpPlot; calculate model evaluation metrics by using built-in functions; calculate model evaluation metrics and visualizations by using custom code, such as mean absolute percentage error and precision recall curves
Create additional models using RML packages
Build and use a One-Class Support Vector Machine, build and use linear and logistic regressions that use L1 and L2 regularization, build and use a decision tree by using FastTree, use FastTree as a recommender with ranking loss (NDCG), build and use a simple three-layer feed-forward neural network

Use R Server in different environments
Use different compute contexts to run R Server effectively
Change the compute context (rxHadoopMR, rxSpark, rxLocalseq, and rxLocalParallel); identify which compute context to use for different tasks; use different data source objects, depending on the context (RxOdbcData and RxTextData); identify and use appropriate data sources for different data sources and compute contexts (HDFS and SQL Server); debug processes across different compute contexts; identify use cases for RevoPemaR
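Compute-context switching, as described above, is a one-line change: the rx* analysis code stays the same, and RevoScaleR routes execution to wherever the active context points. A sketch assuming a RevoScaleR installation:

```r
library(RevoScaleR)

# Run subsequent rx* calls in parallel on the local machine ...
rxSetComputeContext(RxLocalParallel())

# ... rx* analyses placed here execute under the parallel local context ...

# ... then switch back to sequential local execution (the default).
rxSetComputeContext(RxLocalSeq())

# A Hadoop or Spark context is set the same way, e.g.:
#   rxSetComputeContext(RxSpark(...))  # connection details omitted
```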
Optimize tasks by using local compute contexts
Identify and execute tasks that can be run only in the local compute context, identify tasks that are more efficient to run in the local compute context, choose between rxLocalseq and rxLocalParallel, profile across different compute contexts
Perform in-database analytics by using SQL Server
Choose when to perform in-database versus out-of-database computations, identify limitations of in-database computations, use in-database versus out-of-database compute contexts appropriately, use stored procedures for data processing steps, serialize objects and write back to binary fields in a table, write tables, configure R to optimize SQL Server (chunksize, numtasks, and computecontext), effectively communicate performance properties to SQL administrators and architects (SQL Server Profiler)
Implement analysis workflows in the Hadoop ecosystem and Spark
Use appropriate R Server functions in Spark; integrate with Hive, Pig, and Hadoop MapReduce; integrate with the Spark ecosystem of tools, such as SparklyR and SparkR; profile and tune across different compute contexts; use doRSR for parallelizing code that was written using open source foreach
Deploy predictive models to SQL Server and Azure Machine Learning
Deploy predictive models to SQL Server as a stored procedure, deploy an arbitrary function to Azure Machine Learning by using the AzureML R package, identify when to use DeployR


Question No : 1

Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.

You need to evaluate the significance of the coefficients that are produced by using a model that was estimated already.

Which function should you use?

A. rxPredict
B. rxLogit
C. Summary
D. rxLinMod
E. rxTweedie
F. stepAic
G. rxTransform
H. rxDataStep

Answer: D

Explanation: https://docs.microsoft.com/en-us/r-server/r/how-to-revoscaler-linear-model
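To make the answer concrete: the model is estimated with rxLinMod, and coefficient significance is then read from the model summary. A minimal sketch, assuming RevoScaleR and using R's built-in `iris` data:

```r
library(RevoScaleR)

# Estimate the linear model ...
model <- rxLinMod(Sepal.Length ~ Sepal.Width + Petal.Length, data = iris)

# ... then inspect the coefficient table (estimates, standard errors,
# t-values, and p-values) to evaluate significance.
summary(model)
```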


Question No : 2

You need to build a model that looks at the probability of an outcome. You must regularize by using L1 and L2. Which classification method should you use?

A. Two-Class Neural Network
B. Two-Class Support Vector Machine
C. Two-Class Decision Forest
D. Two-Class Logistic Regression

Answer: A


C2090-930 IBM SPSS Modeler Professional v3

Test information:
Number of questions: 60
Time allowed in minutes: 90
Required passing score: 67%
Languages: English, Japanese

Related certifications:
IBM Certified Specialist – SPSS Modeler Professional v3

This test will certify that the successful candidate has the fundamental knowledge to participate as an effective team member in the implementation of IBM SPSS Modeler Professional analytics solutions.

SPSS Modeler Professional Functionality (10%)
Identify the purpose of each palette
Describe the use of SuperNodes
Describe the advantages of SPSS Modeler scripting

Business Understanding and Planning (10%)
Describe the CRISP-DM process
Describe how to map business objectives to data mining goals

Data Understanding (15%)
Describe appropriate nodes for summary statistics, distributions, and visualizations (for example, graph nodes, output nodes)
Describe data quality issues (for example, outliers and missing data)

Data Preparation (20%)
Describe methods for data transformation (for example, Derive node, Auto Data Prep node, Data Audit node and Filler node)
Describe how to integrate data (for example, Merge node and Append node)
Describe sampling, partitioning, and balancing data (for example, Sample node, Balance node and Partition node)
Describe methods for refining data (for example, Select node, Filter node and Aggregate node)

Modeling (20%)
Describe classification models (including GLM and regression)
Describe segmentation models
Describe association models
Describe auto modeling nodes
Demonstrate how to combine models using the Ensemble node

Evaluation and Analysis (15%)
Demonstrate how to interpret SPSS Modeler results (for example, using Evaluation node, Analysis node, and data visualizations)
Describe how to use model nugget interfaces

Deployment (10%)
Describe how to use Export nodes (tools for exporting data)
Identify how to score new data using models
Identify SPSS Modeler reporting methods

IBM Certified Specialist – SPSS Modeler Professional v3

Job Role Description / Target Audience
The candidate has knowledge of analytical solutions, understands IBM SPSS Modeler capabilities, has knowledge of the IBM SPSS Modeler data model, can apply consistent methodologies to every engagement and develop SPSS predictive models.

To achieve the IBM Certified Specialist – SPSS Modeler Professional certification, candidates must possess the skills identified under Recommended Prerequisite Skills, if any, and pass one (1) exam.

Upon completion of this technical certification the successful candidate shows having the fundamental knowledge to participate as an effective team member in the implementation of IBM SPSS Modeler Professional analytics solution.

Recommended Prerequisite Skills
The following are topics that are assumed before your test preparation and will not be tested on :
Database and ODBC concepts
Basic proficiency in statistical concepts
Knowledge of basic computer programming

QUESTION 1
You have collected data about a set of patients, all of whom suffered from the same illness. During their course of treatment, each patient responded to one of five medications. The column, Drug, is a character field that describes the medication. You need to find out which proportion of the patients responded to each drug.
Which node should be used?

A. Web node
B. Distribution node
C. Sim Fit node
D. Evaluation node

Answer: C


QUESTION 2
When describing data, which two nodes address value types? (Choose two.)

A. Data Audit node
B. Statistics node
C. Type node
D. Report node

Answer: A,C


QUESTION 3
How many stages are there in the CRISP-DM process model?

A. 4
B. 6
C. 8
D. 10

Answer: C


QUESTION 4
An organization wants to determine why they are losing customers.
Which supervised modeling technique would be used to accomplish this task?

A. PCA
B. QUEST
C. Apriori
D. Kohonen

Answer: C


QUESTION 5
You want to create a Filter node to keep only a subset of the variables used in model building, based on predictor importance.
Which menu in the model nugget browser provides this functionality?

A. File
B. Preview
C. View
D. Generate

Answer: C


C2090-913 Informix 4GL Development

Test information:
Number of questions: 90
Time allowed in minutes: 90
Required passing score: 78%
Languages: English

Related certifications:
IBM Certified Solutions Expert — Informix 4GL Developer


Section 1 – Informix 4GL (18%)

Section 2 – Statements (28%)

Section 3 – Cursors and Memory (13%)

Section 4 – Creating a Help File: The mkmessage Utility (1%)

Section 5 – Creating a Report Driver (3%)

Section 6 – Defining Program Variables (3%)

Section 7 – Displaying Forms and Windows (4%)

Section 8 – Forms that use Arrays (4%)

Section 9 – Passing Values between Functions (6%)

Section 10 – Procedural Logic (1%)

Section 11 – The REPORT Functions (3%)

Section 12 – The SQLCA Record (6%)

IBM Certified Solutions Expert — Informix 4GL Developer

Job Role Description / Target Audience
If you are a knowledgeable Informix 4GL Developer and are capable of performing the intermediate to advanced skills required to design and develop Informix database applications, you may benefit from this certification role.

To attain the IBM Certified Solutions Expert – Informix 4GL Developer certification, candidates must pass 1 test.

Recommended Prerequisite Skills
Significant experience as an Informix 4GL Developer.

 


QUESTION 1
Which parts of the DISPLAY ARRAY statement are always required?

A. ON KEY keywords
B. screen array name
C. program array name
D. END DISPLAY keywords
E. DISPLAY ARRAY keywords
F. BEFORE DISPLAY keywords

Answer: B,C,E


QUESTION 2
What can the arr_count() library function be used to determine?

A. the current position in the screen array
B. the current position in the program array
C. the number of elements in the screen array
D. the number of elements in the program array

Answer: D

Explanation:


QUESTION 3
Which features are unique to the INPUT ARRAY statement?

A. BEFORE/AFTER ROW clause
B. BEFORE/AFTER INPUT clause
C. BEFORE/AFTER FIELD clause
D. BEFORE/AFTER DELETE clause
E. BEFORE/AFTER INSERT clause

Answer: A,D,E



C2090-719 InfoSphere Warehouse V9.5


Test information:
Number of questions: 60
Time allowed in minutes: 90
Required passing score: 65%
Languages: English, Japanese

Related certifications:
IBM Certified Solution Designer – InfoSphere Warehouse V9.5

This certification exam certifies that the successful candidate has important knowledge, skills, and abilities necessary to perform the intermediate and advanced skills required to design, develop, and support InfoSphere Warehouse V9.5 applications.

Section 1 – Architecting Warehouse Solutions (15%)
Demonstrate knowledge of InfoSphere Warehouse architecture and components
Editions
Software Components (why/when to use)
Describe the InfoSphere Warehouse building life-cycle
Steps to build and deploy the application(s)

Section 2 – Implementation (Table Ready) (5%)
Describe hardware topologies
Given a scenario, demonstrate how to implement security considerations

Section 3 – Physical Data Modeling (15%)
Given a scenario, demonstrate knowledge of the modeling process and the Design Studio features used
Identify physical design methods
Compare and synchronize
Impact analysis
Components
Enhancing the model
Given a scenario, describe range/data partitioning considerations
When is it appropriate to use
Cost

Section 4 – Cubing Services (CS) (20%)
Demonstrate knowledge of Cubing Services components
Cube server
Design Studio
MQT administration
Given a scenario, describe CS tooling and access methods
Demonstrate knowledge of CS optimization advisor
Identify the steps in creating a CS OLAP cube
Metadata
Creation of cube model and cube
Demonstrate knowledge of CS administration
Deploying cubes to cube server
Deploying cubes across multiple servers
Caching

Section 5 – Data Mining/Unstructured Text Analytics (12%)
Given a scenario, demonstrate knowledge of data mining and unstructured text analytics in InfoSphere Warehouse V9.5
Given a scenario, describe the InfoSphere Intelligent Miner methods and how to use them
The mining process
Modeling
Scoring
Visualization
Demonstrate how to use Design Studio to implement mining methods
Mining unstructured text data – what do you do with it after it is extracted
Describe the unstructured text analytic information extraction process
Using Java regular expressions
Dictionary
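The information-extraction step above pairs dictionary lookups with regular-expression rules. InfoSphere Warehouse itself uses Java regular expressions; the same idea can be sketched in Python for brevity (the dictionary, pattern, and sample note below are illustrative assumptions, not product code):

```python
import re

# Illustrative only: a tiny dictionary-plus-regex extractor in the spirit
# of the text-analytics step described above (not InfoSphere Warehouse code).
drug_dictionary = {"aspirin", "ibuprofen"}      # dictionary-based terms
dosage_pattern = re.compile(r"\b(\d+)\s*mg\b")  # rule-based pattern

def extract(note):
    # Normalize tokens, then combine dictionary matches with regex matches.
    words = {w.lower().strip(".,") for w in note.split()}
    drugs = sorted(drug_dictionary & words)                   # dictionary hits
    dosages = [int(m) for m in dosage_pattern.findall(note)]  # regex hits
    return drugs, dosages

print(extract("Patient given Aspirin 200 mg twice daily."))
# -> (['aspirin'], [200])
```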

Section 6 – SQL Warehousing Tool (SQW) (20%)
Demonstrate knowledge of SQW components
Data flows
Control flows
Mining flows
Variables
Versioning
Describe SQW anatomy
Operators
Ports
Connectors
Given a scenario, describe the SQW debugging functions

Section 7 – Run-time Administration and Monitoring of the Warehouse (13%)
Identify the application preparation steps for deployment
Describe the InfoSphere Warehouse components managed by Admin console
Demonstrate knowledge of managing, monitoring, and scheduling processes in Admin console
Given a scenario, demonstrate knowledge of workload management and monitoring
Difference between workload and classes
Controlling types of queries
Performance Expert

IBM Certified Solution Designer – InfoSphere Warehouse V9.5

Job Role Description / Target Audience
This certification exam certifies that the successful candidate has important knowledge, skills, and abilities necessary to perform the intermediate and advanced skills required to design, develop, and support InfoSphere Warehouse V9.5 applications. Applicable roles include: Solutions Architect, Data Warehouse Developers, and Database Administrator (in a data warehousing environment)

Requirements
This certification requires 1 test(s).

Test(s) required:
Click on the link(s) below to see test details, test objectives, suggested training and sample tests.

Test C2090-719 – InfoSphere Warehouse V9.5

QUESTION 1
What are two reasons for a combination of database and front-end tool based analytic
architectures in a data warehouse implementation? (Choose two.)

A. Less data is moved across the network, making queries run faster.
B. The database can provide consistent analytic calculations and query speed for common queries.
C. The combination of architectures will ensure fast query performance.
D. Multidimensional queries cannot be processed in SQL by the database engine so it must be done using a front-end tool.
E. The front-end tool allows for additional and more complex algorithms specific to applications that use that tool.

Answer: B,E



QUESTION 2
After deploying an application, you might need to update it by making changes to one or more
data flows. Deploying changes to an existing application is called delta deployment. How do you
package changes using delta deployment?

A. Package only the operator or property that has changed.
B. Package the data flow that has changed.
C. Package the control flow.
D. Package all the items that were originally packaged and use the same profile that was used.

Answer: C



QUESTION 3
You are implementing a DB2 Workload Manager (WLM) schema to limit the number of load
utilities that can execute concurrently. Which WLM object would be used to accomplish this?

A. work class with an associated work action and an appropriate threshold
B. workload with an associated service class and an appropriate threshold
C. work class with an associated service class and an appropriate threshold
D. workload with an associated work action and an appropriate threshold

Answer: A



QUESTION 4
Several operators are defined and linked together in DataFlow1. Another set of operators make up
DataFlow2. A control flow is defined and both DataFlow1 and DataFlow2 are used. You require
that DataFlow1 dynamically change the variable values used in DataFlow2. How can you fulfill this
requirement?

A. The inherent design of the SQL Warehouse Tool is that any variable value changed in one data
flow is accessible by any other data flow as long as the data flows are defined in the same warehouse project.
B. Using the File Export operator, DataFlow1 writes a file that contains updated variable values.
DataFlow2 accesses those updated variable values by reading that same file using an Import File operator.
C. When a control flow is executed, a run profile provides the initial values for all variables. Once
those values are set in the run profile, they are in effect for the entire execution of the control flow.
D. Using the File Export operator, DataFlow1 writes a file, containing updated variable values. A
variable assignment operator is then used to assign the values in the file to the appropriate
variables. DataFlow2 then has access to the updated variable values.

Answer: D



QUESTION 5
Relational database and a database model that is often a star or snowflake schema are
characteristics of which engine storage structure?

A. MOLAP
B. ROLAP
C. Multidimensional cubing
D. Proprietary

Answer: B


C2090-645 IBM Cognos 10 BI Multidimensional Author


Test information:
Number of questions: 57
Time allowed in minutes: 60
Required passing score: 75%
Languages: English, Japanese

Related certifications:
IBM Certified Designer – Cognos 10 BI Multidimensional Reports
IBM Certified Solution Expert – Cognos BI

The Cognos 10 BI Multidimensional Author Exam covers key concepts, technologies, and functionality of the Cognos products. In preparation for an exam, we recommend a combination of training and hands-on experience, and a detailed review of product documentation.

Dimensional Data Components (28%)
Distinguish between relational, DMR, and dimensional data sources
Identify dimensional data items and expressions
Define multidimensional data structure components
Describe the importance of report context
Identify the default measure of a report
Describe default members and their purpose
Describe what a MUN is and identify the impact of using the wrong MUN
Describe what a set is
Describe what a tuple is

Focus Reports (14%)
Distinguish between dimensional and relational filtering styles
Identify techniques to focus data using the dimensional style
Interpret data that is focused based on members
Interpret data that is filtered based on measure values
Describe the purpose of a slicer

Drilling in Reports (14%)
Describe default drill-down behavior
Describe default drill-up behavior
Describe cases for advanced drilling configuration
Appraise reports generated with calculations that are preserved during drilling
Describe how member sets work

Drill-through Access (8%)
Identify supported drill-through data item combinations
Set up drill-through access
Describe a conformed dimension

Calculations and Dimensional Functions (36%)
Describe the use of arithmetic operations in queries
Analyze the use of dimensional functions in queries
Examine coercion
Apply prompts to focus reports
Compose complex expressions that combine and reuse existing expressions


QUESTION 1
To display all individual campaigns in a crosstab report, a report author could use the expression set([TrailChef Campaign],[EverGlow Campaign],[Course Pro Campaign]). Instead, the report author decides to use the parent member of the campaigns in the set expression “children([All Campaigns])”. Which statement is true about the method that was used?

A. In the future, when a campaign is deleted or new ones are added, the report author must modify the expression.
B. In the future, when a campaign is deleted or new ones are added, the unmodified expression will be valid.
C. The report author should not have used the method chosen, as the first method is best
in this situation.
D. To be accurate, the report author should avoid using a set expression.

Answer: B
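The maintainability argument behind this answer can be seen through a simple analogy: a set derived from its parent member survives additions to the hierarchy, while a hardcoded list does not. (The structures below are a Python stand-in, not Cognos metadata or syntax.)

```python
# Stand-in hierarchy: parent member -> child members (not Cognos metadata).
hierarchy = {"All Campaigns": ["TrailChef Campaign", "EverGlow Campaign",
                               "Course Pro Campaign"]}

# Hardcoded set, like set([TrailChef Campaign], ...):
hardcoded_set = ["TrailChef Campaign", "EverGlow Campaign",
                 "Course Pro Campaign"]
# Derived set, like children([All Campaigns]) -- resolved from the parent:
derived_set = hierarchy["All Campaigns"]

# A new campaign appears in the source data:
hierarchy["All Campaigns"].append("Star Gazer Campaign")

print(len(hardcoded_set))  # still 3 -- the report must be edited
print(len(derived_set))    # now 4 -- the expression stays valid
```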


QUESTION 2
Which of the following statements is correct about the order function?

A. The currentMeasure function must be used with the order function as the sort by criterion.
B. It arranges members of all sets in the report by ascending or descending values.
C. Optional parameters allow the author to order the members of a hierarchy without regard of their level.
D. It arranges members of a set alphabetically by ascending or descending captions.

Answer: C


QUESTION 3
A report author is working with an OLAP data source. The report author creates a query that uses a caption function on a member and applies a string function. What is a possible consequence of this action?

A. Using these dimensional methods will not work with an OLAP data source.
B. The mapped string values will not pass through to the target report.
C. There is nothing wrong with this approach.
D. Mixing dimensional styles and relational styles in a single query can create unexpected results.

Answer: D


QUESTION 4
When must a report author use the caption function?

A. As the first parameter of the roleValue function.
B. To return the display name for the specified business key.
C. To see the string display name for the specified element.
D. To pass the returned value to a drill-through target report, which expects a matching string as a parameter value.

Answer: D


QUESTION 5
Instead of prompting the user to select any countries in Europe, the report author wants to constrain the user to select one or more countries from the Northern Europe region. What kind of prompt should be used and how can this be achieved?

A. This is not possible because a prompt must always be populated with all members of a level.
B. Create a multi-select value prompt. Populate it using an expression on the [Northern Europe] member to retrieve its children on the country level.
C. Generate a prompt by creating an expression with a parameter on the crosstab edge: children([Northern Europe]->?Country?
D. Create a tree prompt, and populate it using an expression on the [Northern Europe]
member to retrieve its children at the country level.

Answer: B


Exam 70-694 Virtualizing Enterprise Desktops and Apps


Published: January 8, 2015
Languages: English
Audiences: IT professionals
Technology: Windows 8.1, Windows Server 2012 R2, Microsoft Intune
Credit toward certification: MCP

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. The percentages indicate the relative weight of each major topic area on the exam. The higher the percentage, the more questions you are likely to see on that content area on the exam. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.

If you have other questions or feedback about Microsoft Certification exams or about the certification program, registration, or promotions, please contact your Regional Service Center.

Plan app virtualization (27%)
Design an app distribution strategy
Design considerations, including impact on clients, offline access, deployment infrastructure, and remote locations; plan for updates to apps
Plan and implement app compatibility
Configure and implement Microsoft Assessment and Planning (MAP) Toolkit; planning considerations, including Remote Desktop Services (RDS), Virtual Desktop Infrastructure (VDI), client Hyper-V, and Application Compatibility Toolkit (ACT); plan for application version co-existence
Update apps in desktop images
Configure online servicing, apply patches offline, configure offline virtual machine (VM) servicing, update Microsoft Deployment Toolkit (MDT) task sequences

Implement app virtualization (25%)
Configure App-V
Configure a new application, configure a Connection Group, configure App-V reporting on the client, create a report for App-V
Deploy App-V clients
Install and test the App-V client; configure the App-V client by using Group Policy
Configure apps sequencing
Install the App-V Sequencer, deploy sequenced apps, update sequenced apps, publish Office 2013 and Sequencing Add-On for Word 2013, deploy connected apps

Plan and implement virtual desktops (21%)
Plan for pooled and personal desktops
Planning considerations, including shared storage, network, Storage Spaces, and scale-out file servers; plan capacity
Implement virtual desktop collections
Configure collections type, VM creation, and user assignments; configure client settings; configure Active Directory permissions
Plan and implement Remote Desktop Services (RDS)
Install and configure Remote Desktop Session Host, install and configure the Remote Desktop Web Access (RD Web Access) role, configure the Remote Desktop Connection Broker (RD Connection Broker) for the Remote Desktop Session Host, perform capacity analysis
Create and configure remote applications
Prepare Remote Desktop Session Hosts for application installation; configure RemoteApp properties; create a RemoteApp distribution file (MSI or RDP); sign packages with certificates; implement application version co-existence, by using RD Web Access; configure file extension associations
Deploy and manage remote applications
Configure RemoteApp and Desktop Connections settings, configure GPOs for signed packages, configure RemoteApp for Hyper-V, export and import RemoteApp configurations, deploy a RemoteApp distribution file (MSI or RDP)

Plan and implement business continuity for virtualized apps (27%)
Plan and implement a resilient Remote Desktop infrastructure
Design highly available RD Web Access, RD Connection Broker, and Remote Desktop Gateway; perform backup and recovery of the Remote Desktop Licensing server; configure VM or dedicated farm redirection
Plan and implement business continuity for virtual desktops
Design and implement Hyper-V Replica with Hyper-V Replica Broker, design and implement business continuity for personal and shared desktop collections
Plan and implement a resilient virtual app delivery infrastructure
Plan and implement highly available App-V data store and management server; implement pre-populated/shared cache App-V functionality for the VDI environment; implement highly available content share; implement a branch office strategy, using App-V; manage VM backups


QUESTION 1
You need to meet the application requirements for Server1 and Server2.
What are two possible ways to achieve this goal? Each correct answer presents a complete solution.

A. Create a selection profile.
B. Create a deployment database.
C. Create a linked deployment share.
D. Modify the deployment share properties.
E. Create a new image group.



QUESTION 2
Which software should you install on Server2 to support the planned changes?

A. SQL Server 2012 Express
B. WDS
C. Windows Assessment and Deployment Kit (Windows ADK)
D. Microsoft .NET Framework 3.5



QUESTION 3
You need to meet the application requirements for the client computers of the managers.
What should you do?

A. Create a customization file named Custom.msp. Copy Custom.msp to \\Server4 \Software\Updates. Run \\Server4\Software\Setup.exe without specifying any parameters.
B. Create a customization file named Custom.xml. Copy Custom.xml to \\Server4 \Software\Proplus.ww. Run \\Server4\Software\Setup.exe without specifying any parameters.
C. Create a customization file named Custom.xml. Copy Custom.xml to \\Server4 \Software\Updates. Run \\Server4\Software\Setup.exe without specifying any parameters.
D. Create a customization file named Answer.xml. Copy Answer.xml to the managers’ computers. Run \\Server4\Software\Setup.exe and specify the /admin parameter.


