10 02 2015
Exam 70-463 Dumps New Version Published Today With Latest Added Questions From Microsoft Exam Center! (211-225)
2015 Timesaving Comprehensive Guides For Microsoft 70-463 Exam: Using the Latest Released Braindump2go 70-463 Practice Test Questions, Quickly Pass the 70-463 Exam 100%! The Following Questions and Answers Are All Newly Published by the Microsoft Official Exam Center!
Vendor: Microsoft
Exam Code: 70-463
Exam Name: Implementing a Data Warehouse with Microsoft SQL Server 2012
QUESTION 211
You are developing a SQL Server Integration Services (SSIS) project with multiple packages to copy data to a Windows Azure SQL Database database.
An automated process must validate all related Environment references, parameter data types, package references, and referenced assemblies.
The automated process must run on a regular schedule.
You need to establish the automated validation process by using the least amount of administrative effort.
What should you do?
A. Use an event handler for OnError for the package.
B. Use an event handler for OnError for each data flow task.
C. Use an event handler for OnTaskFailed for the package.
D. View the job history for the SQL Server Agent job.
E. View the All Messages subsection of the All Executions report for the package.
F. Store the System::SourceID variable in the custom log table.
G. Store the System::ServerExecutionID variable in the custom log table.
H. Store the System::ExecutionInstanceGUID variable in the custom log table.
I. Enable the SSIS log provider for SQL Server for OnError in the package control flow.
J. Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.
K. Deploy the project by using dtutil.exe with the /COPY DTS option.
L. Deploy the project by using dtutil.exe with the /COPY SQL option.
M. Deploy the .ispac file by using the Integration Services Deployment Wizard.
N. Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored
procedure.
O. Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored
procedure.
P. Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and
SSISDB.catalog.start_execution stored procedures.
Q. Create a table to store error information. Create an error output on each data flow destination
that writes OnError event text to the table.
R. Create a table to store error information. Create an error output on each data flow destination
that writes OnTaskFailed event text to the table.
Answer: N
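For context, a SQL Server Agent job with a single T-SQL job step can call catalog.validate_project on whatever schedule you need; project-level validation covers the environment references, parameter data types, package references, and referenced assemblies in one call. A minimal sketch of the job step command follows, assuming the project was deployed to a folder named ETL as a project named AzureLoad (both names are placeholders):

-- Hedged sketch of the Agent job step's T-SQL command; substitute your own
-- folder and project names for the placeholders below.
DECLARE @folder_name  nvarchar(128) = N'ETL';        -- placeholder folder
DECLARE @project_name nvarchar(128) = N'AzureLoad';  -- placeholder project

EXEC SSISDB.catalog.validate_project
     @folder_name       = @folder_name,
     @project_name      = @project_name,
     @validate_type     = 'F',    -- full validation
     @environment_scope = 'A',    -- consider all environment references
     @reference_id      = NULL,   -- not used when scope is 'A'
     @use32bitruntime   = 0;

catalog.validate_package works the same way but would have to be scheduled once per package, which is why validating the whole project is the lower-effort answer here.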
QUESTION 212
You are developing a SQL Server Integration Services (SSIS) project by using the Project Deployment Model. All packages in the project must log custom messages.
You need to produce reports that combine the custom log messages with the system generated log messages.
What should you do?
A. Use an event handler for OnError for the package.
B. Use an event handler for OnError for each data flow task.
C. Use an event handler for OnTaskFailed for the package.
D. View the job history for the SQL Server Agent job.
E. View the All Messages subsection of the All Executions report for the package.
F. Store the System::SourceID variable in the custom log table.
G. Store the System::ServerExecutionID variable in the custom log table.
H. Store the System::ExecutionInstanceGUID variable in the custom log table.
I. Enable the SSIS log provider for SQL Server for OnError in the package control flow.
J. Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.
K. Deploy the project by using dtutil.exe with the /COPY DTS option.
L. Deploy the project by using dtutil.exe with the /COPY SQL option.
M. Deploy the .ispac file by using the Integration Services Deployment Wizard.
N. Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored
procedure.
O. Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored
procedure.
P. Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and
SSISDB.catalog.start_execution stored procedures.
Q. Create a table to store error information. Create an error output on each data flow destination
that writes OnError event text to the table.
R. Create a table to store error information. Create an error output on each data flow destination
that writes OnTaskFailed event text to the table.
Answer: G
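The System::ServerExecutionID variable holds the SSIS catalog execution ID, so once each package stores it in the custom log table, the custom rows can be joined and combined with the catalog's own operation messages. A hedged sketch follows, assuming a hypothetical custom log table dbo.CustomLog(ServerExecutionID bigint, LogTime datetime2, CustomMessage nvarchar(max)) populated by the packages:

-- Combine system-generated catalog messages with the custom log rows,
-- keyed on the server execution ID. dbo.CustomLog is a placeholder table.
SELECT  e.execution_id        AS server_execution_id,
        m.message_time,
        m.message,
        N'System' AS message_origin
FROM    SSISDB.catalog.executions         AS e
JOIN    SSISDB.catalog.operation_messages AS m
        ON m.operation_id = e.execution_id
UNION ALL
SELECT  c.ServerExecutionID,
        c.LogTime,
        c.CustomMessage,
        N'Custom'
FROM    dbo.CustomLog AS c
ORDER BY server_execution_id, message_time;

ServerExecutionID is only populated when the package runs through the SSIS catalog, which is why it, rather than SourceID or ExecutionInstanceGUID, lines up with the catalog's system log.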
QUESTION 213
You are developing a SQL Server Integration Services (SSIS) package to implement an incremental data load strategy.
The package reads data from a source system that uses the SQL Server change data capture (CDC) feature.
You have added a CDC Source component to the data flow to read changed data from the source system.
You need to add a data flow transformation to redirect rows for separate processing of insert, update, and delete operations.
Which data flow transformation should you use?
A. Audit
B. Merge Join
C. Merge
D. CDC Splitter
Answer: D
Explanation:
The CDC Splitter splits a single flow of change rows from a CDC Source data flow into different data flows for insert, update, and delete operations.
http://msdn.microsoft.com/en-us/library/hh758656.aspx
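As background, the CDC Source (and therefore the CDC Splitter downstream of it) assumes change data capture is already enabled on the source database and table. A context-only sketch, with SourceDB and dbo.Orders as placeholder names:

USE SourceDB;                 -- placeholder source database
EXEC sys.sp_cdc_enable_db;    -- enable CDC at the database level

EXEC sys.sp_cdc_enable_table
     @source_schema        = N'dbo',
     @source_name          = N'Orders',   -- placeholder table
     @role_name            = NULL,        -- no gating role
     @supports_net_changes = 1;           -- requires a primary key (or a named unique index)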
QUESTION 214
Drag and Drop Questions
A Data Flow task in a SQL Server Integration Services (SSIS) package produces run-time errors. You need to edit the package to log specific error messages.
Which three actions should you perform in sequence? (To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.)
Answer:
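If the package is configured with the SSIS log provider for SQL Server and the OnError event selected for the Data Flow task, the provider writes its rows to the dbo.sysssislog table in the database of the chosen connection manager. A hedged sketch of pulling just the logged error messages afterwards:

-- Assumes the SSIS log provider for SQL Server was used; sysssislog lives in
-- the database that the logging connection manager points to.
SELECT  starttime,
        source,     -- task or package that raised the error
        message
FROM    dbo.sysssislog
WHERE   event = N'OnError'
ORDER BY starttime DESC;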
QUESTION 215
You need to extract data from delimited text files.
What connection manager type would you choose?
A. A Flat File connection manager
B. An OLE DB connection manager
C. An ADO.NET connection manager
D. A File connection manager
Answer: A
QUESTION 216
Some of the data your company processes is sent in from partners via email.
How would you configure an SMTP connection manager to extract files from email messages?
A. In the SMTP connection manager, configure the OperationMode setting to Send And
Receive.
B. It is not possible to use the SMTP connection manager in this way, because it can only be
used by SSIS to send email messages.
C. The SMTP connection manager supports sending and receiving email messages by default,
so no additional configuration is necessary.
D. It is not possible to use the SMTP connection manager for this; use the IMAP (Internet
Message Access Protocol) connection manager instead.
Answer: B
QUESTION 217
You need to extract data from a table in a SQL Server 2012 database.
What connection manager types can you use? (Choose all that apply.)
A. An ODBC connection manager
B. An OLE DB connection manager
C. A File connection manager
D. An ADO.NET connection manager
Answer: ABD
QUESTION 218
In your SSIS solution, you need to load a large set of rows into the database as quickly as possible.
The rows are stored in a delimited text file, and only one source column needs its data type converted from String (used by the source column) to Decimal (used by the destination column).
What control flow task would be most suitable for this operation?
A. The File System task would be perfect in this case, because it can read data from files and
can be configured to handle data type conversions.
B. The Bulk Insert task would be the most appropriate, because it is the quickest and can
handle data type conversions.
C. The data flow task would have to be used, because the data needs to be transformed before
it can be loaded into the table.
D. No single control flow task can be used for this operation, because the data needs to be
extracted from the source file, transformed, and then loaded into the destination table.
At least three different tasks would have to be used: the Bulk Insert task to load the data into
a staging database, a Data Conversion task to convert the data appropriately, and finally, an Execute SQL task to merge the transformed data with existing destination data.
Answer: C
QUESTION 219
A part of your data consolidation process involves extracting data from Excel workbooks.
Occasionally, the data contains errors that cannot be corrected automatically.
How can you handle this problem by using SSIS?
A. Redirect the failed data flow task to an External Process task, open the problematic Excel
file in Excel, and prompt the user to correct the file before continuing the data consolidation
process.
B. Redirect the failed data flow task to a File System task that moves the erroneous file to a dedicated location where an information worker can correct it later.
C. If the error cannot be corrected automatically, there is no way for SSIS to continue with the automated data consolidation process.
D. None of the answers above are correct.
Due to Excel’s strict data validation rules, an Excel file cannot ever contain erroneous data.
Answer: B
QUESTION 220
In your ETL process, there are three external processes that need to be executed in sequence, but you do not want to stop execution if any of them fails.
Can this be achieved by using precedence constraints? If so, which precedence constraints can be used?
A. No, this cannot be achieved just by using precedence constraints.
B. Yes, this can be achieved by using completion precedence constraints between the first and
the second and between the second and the third Execute Process tasks, and by using a
success precedence constraint between the third Execute Process task and the following
task.
C. Yes, this can be achieved by using completion precedence constraints between the first and
the second, between the second and the third, and between the third Execute Process task
and the following task.
D. Yes, this can be achieved by using failure precedence constraints between the first and the second, and between the second and the third Execute Process tasks, and by using a
completion precedence constraint between the third Execute Process task and the following
task.
Answer: B
QUESTION 221
You are administering SQL Server Integration Services (SSIS) permissions on a production server that runs SQL Server 2012.
Quality Assurance (QA) testers in the company must have permission to perform the following tasks on specific projects:
– View and validate projects and packages
– View Environments and Environment variables
– Execute packages
You need to grant the minimum possible privileges to the QA testers.
What should you do? (Each correct answer presents part of the solution. Choose all that apply.)
A. In the SSISDB database, add QA Tester logons to the ssis_admin role.
B. In the msdb database, add QA Tester logons to the db_ssisoperator role.
C. Grant Modify permission in the projects to the QA Tester logons.
D. Grant Read permission in the SSIS catalog folder, the projects, and the Environments to the
QA Tester logons.
E. Grant Execute permission in the projects to the QA Tester logons.
F. In the msdb database, add QA Tester logons to the db_ssisltduser role.
Answer: BD
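Answer B is the msdb role membership; answer D's Read grants are made on the catalog objects themselves, either in the SSMS permission dialogs or with catalog.grant_permission. A hedged sketch follows; QATester and the ETL folder are placeholder names, and the numeric codes (object_type 1 = folder, principal_type 1 = database principal, permission_type 1 = READ) should be checked against the catalog.grant_permission documentation for your version:

-- Answer B: add the tester's msdb user to db_ssisoperator.
USE msdb;
ALTER ROLE db_ssisoperator ADD MEMBER [QATester];   -- placeholder database user

-- Answer D: grant Read on the catalog folder (repeat for projects/environments).
USE SSISDB;
DECLARE @principal_id int = DATABASE_PRINCIPAL_ID(N'QATester');
DECLARE @folder_id bigint = (SELECT folder_id
                             FROM catalog.folders
                             WHERE name = N'ETL');  -- placeholder folder name
EXEC catalog.grant_permission
     @object_type     = 1,              -- folder
     @object_id       = @folder_id,
     @principal_type  = 1,              -- database principal
     @principal_id    = @principal_id,
     @permission_type = 1;              -- READ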
QUESTION 222
You are designing a data warehouse that uses SQL Server 2012.
The data warehouse contains a table named factSales that stores product sales.
The table has a clustered index on the primary key, four foreign keys to dimension tables, and an aggregate column for sales totals.
All key columns use the int data type and the aggregate column uses the money data type.
You need to increase the speed of data retrieval from the factSales table.
Which index type should you add to the table?
A. Clustered
B. Semantic search
C. Nonclustered
D. XML
Answer: C
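A nonclustered index on the dimension foreign keys, with the aggregate as an included column, is the usual shape of such an index. A sketch using hypothetical column names (DateKey, ProductKey, CustomerKey, StoreKey stand in for the four foreign keys and SalesTotal for the money aggregate):

-- Hypothetical column names; adjust to the actual factSales schema.
CREATE NONCLUSTERED INDEX IX_factSales_Dimensions
ON dbo.factSales (DateKey, ProductKey, CustomerKey, StoreKey)
INCLUDE (SalesTotal);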
QUESTION 223
You are designing a complex SQL Server Integration Services (SSIS) project that uses the Project Deployment model.
The project will contain between 15 and 20 packages.
All the packages must connect to the same data source and destination.
You need to define and reuse the connection managers in all the packages by using the least development effort.
What should you do?
A. Copy and paste the connection manager details into each package.
B. Implement project connection managers.
C. Implement package connection managers.
D. Implement parent package variables in all packages.
Answer: B
QUESTION 224
You are performance tuning a SQL Server Integration Services (SSIS) package to load sales data from a source system into a data warehouse that is hosted on Windows Azure SQL Database.
The package contains a data flow task that has seven source-to-destination execution trees.
Only three of the source-to-destination execution trees are running in parallel.
You need to ensure that all the execution trees run in parallel.
What should you do?
A. Set the EngineThreads property of the data flow task to 7.
B. Set the MaxConcurrentExecutables property of the package to 7.
C. Create seven data flow tasks that contain one source-to-destination execution tree each.
D. Place the data flow task in a For Loop container that is configured to execute seven times.
Answer: A
QUESTION 225
You are developing a SQL Server Integration Services (SSIS) package to implement an incremental data load strategy.
The package reads data from a source system that uses the SQL Server change data capture (CDC) feature.
You have added a CDC Source component to the data flow to read changed data from the source system.
You need to add a data flow transformation to redirect rows for separate processing of insert, update, and delete operations.
Which data flow transformation should you use?
A. DQS Cleansing
B. Merge Join
C. Pivot
D. Conditional Split
Answer: D
Braindump2go is one of the Leading 70-463 Exam Preparation Material Providers Around the World! We Offer a 100% Money Back Guarantee on All Products! Feel Free to Download Our Newly Released 70-463 Real Exam Questions!
http://www.braindump2go.com/70-463.html