Thursday, October 30, 2014

Reset Administrator password of DAC 10g

Issue: How to reset the Administrator password in DAC 10g (Oracle BI Data Warehouse Administration Console)

Solution: 

 There is no documented algorithm to decrypt the DAC Administrator password, which is stored encrypted in the W_ETL_USER metadata table of DAC.

 I searched many websites and went through the Oracle documentation, but unfortunately nowhere did I find a way to reset the password.

 One interesting point: DAC also has other user accounts with "Developer" and "Operator" roles, so I should not disturb the existing user accounts just to reset the Administrator password.

 Let us look at the trick now.

Step 1: Close DAC client.

Step 2: Take the W_ETL_USER table backup.

create table w_etl_user_bkp as select * from w_etl_user;

Step 3: Truncate the table W_ETL_USER.

truncate table w_etl_user;

Step 4: Open the DAC client and try to log in with the credentials below:

User name: Administrator
Password:   Administrator

Step 5: DAC will now prompt you to set a new Administrator password. Enter the new password.

Step 6: Finally, insert the remaining users and their details from the backup table.

 insert into w_etl_user select * from w_etl_user_bkp where username!='Administrator';

 commit;

 Now you have a new Administrator password, and the existing user accounts are back in place.
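 As a quick sanity check you can confirm that every account is back (a minimal sketch; it relies only on the USERNAME column, which we already used in the insert above):

 select username from w_etl_user order by username;
 -- Expect the Administrator row plus every row copied back from w_etl_user_bkp.

 Once you are satisfied, you can drop the backup table with "drop table w_etl_user_bkp;".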




Wednesday, October 1, 2014

DAC10g to DAC11g upgrade


 In this post we will discuss the step-by-step configuration to upgrade from DAC 10g to DAC 11g.

Note: Upgrading to DAC 11g involves upgrading the DAC platform and repository, and then performing post-upgrade tasks.

 Assumptions

•  DAC 10g is already installed and running.
•  We are only upgrading the DAC platform and repository from 10g to 11g.

High level steps to upgrade

a) Perform DAC10g repository backup.
b) Install DAC11g.
c) Use the DAC11g Client to connect to the existing repository.
d) Perform the post upgrade tasks.

a) Perform DAC10g repository backup.

 i) Login to DAC10g with Administrator user credentials.

 ii) From the menu bar, go to "Tools" -> "DAC Repository Management" -> "Export".

 iii) Select all the containers and select "Logical", "Run Time", "System" and "User Data".

 iv) Use "Change import/export folder" to point the export at a folder other than the default one.

b) Install DAC11g.

i) This post assumes DAC 11g has already been installed.

c) Use the DAC11g Client to connect to the existing repository.

i) Log in to the DAC 11g Client with Administrator user credentials, and make sure you select Authentication Type as "DAC".

ii) It will prompt you to upgrade the repository; click Yes.

iii) This process will take several minutes. When the process is complete, the DAC Client will open.

iv) Now you should be able to see DAC10g metadata content in DAC11g format.

d) Perform the post upgrade tasks.

i) Configure the DAC repository to allow DAC Server connections.

ii) Configure connection between the DAC Server and repository.

Thursday, September 18, 2014

DAC 11g code migration information tabs

 In this post, we will discuss where DAC stores created-patch and applied-patch information.

 When you create or apply a DAC metadata patch, this information is stored under the DAC Setup tab for future reference.

Created patch:

 Created patches will be stored in DAC -> Setup -> "Working Patches" tab.

DAC Metadata_working patches tab

 In this tab, you will find when the patch was created, what its contents are, its status, and when it was last exported.


Applied patch:

 Applied patches will be stored in DAC -> Setup -> "Applied Patches" tab.

DAC Metadata_applied_patches tab


 In this tab, you will find information about all the applied patches: which patches were applied and their status.

Tuesday, September 16, 2014

DAC metadata patch failing with errors

  I was working on a DAC metadata patch to migrate a subset of code to another environment. Previously I could migrate it successfully, but a mistake I made while creating the patch produced the error messages below when applying it in the other environment.

 Recreating the patch correctly resolved the issue.

1) , of table W_ETL_SA_STEP is not added or updated to repositoryMESSAGE:::Subject Area Task - Link must have [Task]
EXCEPTION CLASS::: com.siebel.analytics.etl.api.exception.RequiredPropertyMissingException

com.siebel.analytics.etl.api.object.SAStepObject.checkForRequiredProps(SAStepObject.java:157)
com.siebel.analytics.etl.api.object.BaseAPIObject.persist(BaseAPIObject.java:375)
com.siebel.analytics.etl.api.object.BaseAPIObject.persist(BaseAPIObject.java:296)
com.siebel.analytics.etl.inc.objects.IncXMLReader$PersistInfo.persist(IncXMLReader.java:1941)
com.siebel.analytics.etl.inc.objects.IncXMLReader$CurrTableInfo.persistCurrentObjects(IncXMLReader.java:2166)
com.siebel.analytics.etl.inc.objects.IncXMLReader$CurrTableInfo.addObject(IncXMLReader


2) , of table W_ETL_SA_STEP is not added or updated to repositoryMESSAGE:::persists error: null STEP_WID is not specified!
Can't update a record!
EXCEPTION CLASS::: com.siebel.analytics.etl.api.exception.BaseAPIException

com.siebel.analytics.etl.api.object.BaseAPIObject.persist(BaseAPIObject.java:405)
com.siebel.analytics.etl.api.object.BaseAPIObject.persist(BaseAPIObject.java:296)
com.siebel.analytics.etl.inc.objects.IncXMLReader$PersistInfo.persist(IncXMLReader.java:1941)
com.siebel.analytics.etl.inc.objects.IncXMLReader$CurrTableInfo.persistCurrentObjects(IncXMLReader.java:2166)
com.siebel.analytics.etl.inc.objects.IncXMLReader$CurrTableInfo.addObject(IncXMLReader.java:2126)
com.siebel.analytics.etl.inc.objects.IncXMLReader.createAPIObjectNew

3)

createAPIObject fails with error: null, container id =,Oracle 11.5.10EXCEPTION CLASS::: com.siebel.etl.gui.core.NewRecordCreationException

com.siebel.analytics.etl.client.data.model.ResultSetParser.createNewRecord(ResultSetParser.java:203)
com.siebel.analytics.etl.client.data.model.UpdatableDataTableModel.createNewRecord(UpdatableDataTableModel.java:98)
com.siebel.analytics.etl.client.data.model.DACTableModel.createNewRecord(DACTableModel.java:326)
com.siebel.analytics.etl.api.object.BaseAPIObject.persist(BaseAPIObject.java:361)
com.siebel.analytics.etl.api.object.BaseAPIObject.persist(BaseAPIObject.java:296)
com.siebel.analytics.etl.inc.objects.IncXMLReader$PersistInfo.persist(IncXMLReader.java:1941)
com.siebel.analytics.etl.inc.objects.IncXMLReader

Friday, September 12, 2014

Export Multiple Informatica workflows from different folders as single xml file

 I am sharing a technique for exporting multiple Informatica workflows from different folders. Usually you can export workflows from only one folder at a time, but sometimes you need workflows from several folders in one XML file, which is not possible with the regular approach.

If you have this requirement, just follow these steps.

 I have already explained the usage of the Informatica Query option; we will use that option to complete this export activity.

Step 1: Create a query that will list all the required workflows from the different folders.


Informatica query tool for different workflows

Step 2: Execute the query to create the output. You will see workflows from different folders in the output, as marked in red in the image below.

Informatica query tool for different workflows list

Step 3: Hold the Ctrl key to select all the workflows, then right-click and select the "Export to XML file" option.

Informatica query tool for different workflows list export

Step 4: Now you can save the workflows from the different folders as a single XML file.

Informatica query tool for different workflows list export save



Tuesday, September 9, 2014

./nqcmd: error while loading shared libraries: libARicu23.so:

 In this post I am going to explain how to use "nqcmd" command line utility in Linux environment.

Please follow the steps exactly as mentioned; otherwise, while invoking the nqcmd utility you will face a shared library error like the one below.

./nqcmd: error while loading shared libraries: libARicu23.so: cannot open shared object file: No such file or directory

 As Oracle says, to use the nqcmd utility we first have to run "bi-init.sh", which initializes the Oracle instance environment. The file is available in the path below.

 /OBIEE/app/fmw/instances/instance/bifoundation/OracleBIApplication/coreapplication/setup/

Step 1: Open PuTTY, connect to the Linux server, and change directory to the path below. (The path may differ in your environment.)

cd /OBIEE/app/fmw/instances/instance/bifoundation/OracleBIApplication/coreapplication/setup/

Step 2: Run bi-init.sh file exactly as below.

. ./bi-init.sh

Note: That is a dot, a space, then a dot followed immediately by a forward slash. Sourcing the script this way keeps the environment variables it sets in your current shell.

Step 3: Now change directory to the below path

cd /OBIEE/app/fmw/Oracle_BI1/bifoundation/server/bin

Step 4: Execute nqcmd with the command below; it should now start without any library errors.

[obiee@obieedevlinux bin]$ ./nqcmd

 The important step here is Step 2, which sets up the Oracle instance environment.

Download OBIEE 11g server logs from Enterprise manager

 When we face some trouble in OBIEE 11g, we always search for the log files to find out the cause. 

Examples of log files are 

 1) Presentation Services log (sawlog.log)
 2) Server log (nqserver.log & nqquery.log)
 3) Scheduler log (nqscheduler.log)
 4) JavaHost log (jh.log)
 5) Cluster Controller log (nqcluster.log)

 6) OPMN log (opmn.log)

 To get or view these files you normally have to connect to the server using SSH (PuTTY) or FTP (FileZilla). Sometimes you will not have the server login passwords, and in that case getting these files is very difficult.

 Here is a tip for downloading the above log files directly from the OBIEE 11g Enterprise Manager.

Step 1: Login to EM.

Weblogic EM Login

Step 2: Right click on the "bifoundation_domain", then "Logs" and select "View Log Messages".

weblogic EM log files

Step 3: You will see the log files screen as below.

weblogic EM target log files

Step 4: Expand "Selected Targets" as shown below and you will see the log files here.

weblogic EM target logs expand

Step 5: Then select the desired server log and click "View Log File" (marked in red in the screenshot).

weblogic EM BI Server log example

Step 6: Here you can either view the required log or you can directly download the entire log file in to your local machine.

weblogic EM BI Server log download

 You can follow the same steps to download the other log files as well.

Monday, August 25, 2014

Why DAC 11g(Oracle BI Data Warehouse Administration Console) synchronization fails

 People working with DAC (Oracle BI Data Warehouse Administration Console) for the first time commonly face this issue. Below are the exact points that will resolve the problem.

 In this post, you will get an explanation of why DAC 11g(Oracle BI Data Warehouse Administration Console) synchronization fails.

  1) The PATH environment variable is not set for pmrep and pmcmd.
  2) The DAC installation directory contains spaces (a space character).
  3) The Infa_int and Infa_rep connections are not successful.


1) PATH environment variable not set for pmrep and pmcmd

      
PMREP and PMCMD Path evnironment variable

D:\%SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem;%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\;C:\Program Files (x86)\Intel\OpenCL SDK\3.0\bin\x86;C:\Program Files (x86)\Intel\OpenCL SDK\3.0\bin\x64;D:\Informatica\9.1.0\server\bin;D:\Oracle11g\app\mvenkata\product\11.2.0\client_1\bin;C:\Program Files (x86)\Lenovo\Access Connections\;C:\Program Files (x86)\Common Files\lenovo\easyplussdk\bin


Symptom: Without the above PATH entries, pmcmd and pmrep will not be recognized, as shown below.

pmcmd and pmrep are not recognized

  After setting the environment variable, close the command prompt and open it again, then verify the pmcmd and pmrep commands. They should now be recognized, as shown below.

pmcmd and pmrep are invoked




2) DAC installation directory contains spaces

 C:\Program Files (x86)\Oracle Business Intelligence Data Warehouse Administration Console 11g

  If the DAC installation directory contains a space character, as above, then while synchronizing tasks you will simply get a window saying "success", but the source and target tables will not be added to the tasks.

 When you debug the issue, you will be stopped here.

file could not be read error

 The error "C:\Program could not be opened for read." tells us that DAC truncates the path at the first space and therefore cannot read the file.

 So make sure your installation directory does not contain any spaces.

3) Infa_int and Infa_rep connections are not successful

 Make sure the connections we created under the "Setup" tab -> "Informatica Servers" tab test successfully.


infa_int connection successful

infa_rep connection successful




Sunday, August 24, 2014

Full DAC Metadata or Repository tables part 3

Part 3: Execute Tab related metadata tables:


Execute tab highlight


 In this post you will get more information about the Execute tab related metadata tables of DAC (Oracle BI Data Warehouse Administration Console).

1) All execution plans     -    W_ETL_DEFN

    a) Subject areas associated with the EP   -  W_ETL_DEFN_SA
    b) Tables belonging to the EP             -  W_ETL_DEFN_TBL
    c) All tasks involved in the EP           -  W_ETL_DEFN_STEP
    d) Connectivity parameters for the EP     -  W_ETL_DEFN_PRM

2) Current Run / Run History -   W_ETL_DEFN_RUN

    a) Tasks belonging to the EP                     -  W_ETL_RUN_STEP
    b) Details of the tasks in the EP                -  W_ETL_RUN_SDTL
    c) Audit trail (no. of times the EP restarted)   -  W_ETL_RUN_AUDIT
    d) Phase summary                                 -  W_ETL_RUN_STEP
    e) Run type summary                              -  W_ETL_RUN_SDTL

3) EP Schedule information           -  W_ETL_SCHEDULE
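 If you want to poke at these tables directly, a couple of simple selects go a long way (a sketch that uses only the table names above; column layouts vary by DAC version, so start with a plain select *):

 select * from w_etl_defn_run;   -- one row per ETL run (current run and run history)
 select * from w_etl_run_step;   -- the tasks executed within each run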

Related links: 

DAC Metadata or Repository tables part 2

DAC Metadata or Repository tables part 1

Full DAC Metadata or Repository tables part 2


Part 2: System Tab related metadata tables:


System tab highlight


 In this post you will get more information about the System tab related metadata tables of DAC (Oracle BI Data Warehouse Administration Console).

1) DAC system properties    -  W_ETL_SYSPROP

2) Informatica servers         -  W_ETL_SERVER

3) All Physical data sources  - W_ETL_DBCONN

  a) Tables associated with a physical data source and their "Refresh Dates" information          -    W_ETL_REFRESH_DT

4) E-mail recipients           -  W_ETL_EMAIL

5) Working patches           - W_ETL_PATCH

(Information about created patches is stored here.)

   a) Patch content            -  W_ETL_PATCH_DATA
   b) Child patches            -  W_ETL_PATCH_CHILD

6) Applied patches           -   W_ETL_APATCH

   a) Applied patch content           -  W_ETL_APTC_DATA
   b) Applied child patches            -  W_ETL_APTC_CHLD
   c) Deleted objects by this patch - W_ETL_APTC_DEL
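 These too are ordinary tables you can query (a sketch using only the table names from this post; check the exact columns in your own repository):

 select * from w_etl_patch;    -- patches created here (backs the Working Patches tab)
 select * from w_etl_apatch;   -- patches applied here (backs the Applied Patches tab)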

Related links:

DAC Metadata or Repository tables part 1

DAC Metadata or Repository tables part 3


Full DAC Metadata or Repository tables part 1


Part 1: Design Tab related metadata tables: 


Design tab highlight


 In this post you will get more information about the Design tab related metadata tables of DAC (Oracle BI Data Warehouse Administration Console).

1) All Containers          -  W_ETL_APP
2) All User accounts     -  W_ETL_USER
3) All Subject areas      -  W_ETL_SA
  
   a) All tasks related to a subject area     - W_ETL_SA_STEP
   b) All tables related to a subject area    - W_ETL_SA_TABLE

4) All Tables               -  W_ETL_TABLE

   a) All columns of a table                  -  W_ETL_TABLE_COL
   b) All indexes associated with a table     -  W_ETL_INDEX
   c) All source/target usage for a task      -  W_ETL_STEP_TBL

5) All Indexes             -  W_ETL_INDEX

   a) Columns associated with an index    - W_ETL_INDEX_COL

6) All Task groups       - W_ETL_STEP
  
   a) Associated child tasks - W_ETL_GROUP_STEP

7) All Tasks                -  W_ETL_STEP
   
   a) All source/target tables associated with a task  -  W_ETL_STEP_TBL
   b) Tasks associated with a subject area             -  W_ETL_SA_STEP

8) All Configuration Tags - W_ETL_TAG

   a) All subject areas associated with this CT     -    W_ETL_TAG_SA
   b) All tasks associated with this CT             -    W_ETL_TAG_STEP

9) All source system parameters   -   W_ETL_PARAM

10) All source system folders       -   W_ETL_PARAM

11) Container specific SQL           -   W_ETL_APP_TMPLT
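 As a small illustration of how handy these are, you can answer quick inventory questions straight from the repository (a sketch using only the table names above; exact columns vary by DAC version):

 select * from w_etl_app;            -- list all containers
 select count(*) from w_etl_table;   -- how many tables are registered
 select count(*) from w_etl_step;    -- how many tasks and task groups exist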

Related links:

DAC Metadata or Repository tables part 2

DAC Metadata or Repository tables part 3




Friday, August 22, 2014

Listing Informatica objects with Query Tool

Issue: List all Informatica objects (mappings, sessions, workflows, etc.)

Solution: Use the built-in "Queries" option in the Informatica PowerCenter Repository Manager tool.

Some of the uses of this "Queries" option:

a) Creating lists of objects such as mappings, sessions, workflows, etc.
b) Listing objects with their status (valid or invalid).

 In this post I will get the list of workflows in a particular folder using this option. Below are the steps to carry out this activity.

Step 1: Login to "Informatica Powercenter Repository Manager".


Step 2: From the menu bar, go to "Tools" and select "Queries".

Tools and Queries option

Step 3: The Query Browser window will open.

Click New button to create a query

Step 4: Click on the "New" button to create a query and start composing a query like this.

Note: Use the buttons shown in the screenshot to compose the query.

Query window


Step 5: Click on the "Execute" button to list the required objects and the output will be displayed as below.

Objects list output with informatica query tool

Step 6: Click on the "Save" button to save the list. You can later process this data in to an Excel sheet.

Save output list
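 If you also have database access to the PowerCenter repository schema, the MX views offer a SQL alternative to the Query tool (a sketch; it assumes the MX views are available, and the view/column names should be verified against your repository version):

 select subject_area, workflow_name
 from rep_workflows
 order by subject_area, workflow_name;
 -- SUBJECT_AREA is the folder name, so this lists workflows across all folders.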



Wednesday, July 23, 2014

DAC task synchronization failed with the error Whitespace required

Issue: DAC task synchronization failed with the error "Whitespace required"

 Today when I synchronized a task in DAC, it failed with the error below.

MESSAGE:::oracle.xml.parser.v2.XMLParseException: Whitespace required.
EXCEPTION CLASS::: com.siebel.etl.gui.core.MetaDataIllegalStateException

com.siebel.analytics.etl.infa.fileParsing.Infa7WfXmlDomParser.parse(Infa7WfXmlDomParser.java:97)
com.siebel.analytics.etl.infa.fileParsing.Infa7WfXmlDomParser.parseAll(Infa7WfXmlDomParser.java:79)
com.siebel.analytics.etl.infa.fileParsing.Infa7WfXmlDomParser.parseAll(Infa7WfXmlDomParser.java:75)
com.siebel.analytics.etl.infa.fileParsing.TaskSync.sync(TaskSync.java:158)
com.siebel.analytics.etl.client.action.TaskSynchronizationAction.doOperation(TaskSynchronizationAction.java:145)
com.siebel.etl.gui.view.dialogs.WaitDialog.doOperation(WaitDialog.java:54)
com.siebel.etl.gui.view.dialogs.WaitDialog$WaitDialogRunnableHelper.run(WaitDialog.java:86)
java.lang.Thread.run(Unknown Source)

::: CAUSE :::
MESSAGE:::Whitespace required.
EXCEPTION CLASS::: oracle.xml.parser.v2.XMLParseException

oracle.xml.parser.v2.XMLError.flushErrors1(XMLError.java:323)
oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:376)
oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:226)
oracle.xml.jaxp.JXDocumentBuilder.parse(JXDocumentBuilder.java:155)
javax.xml.parsers.DocumentBuilder.parse(Unknown Source)
com.siebel.analytics.etl.infa.fileParsing.Infa7WfXmlDomParser.parse(Infa7WfXmlDomParser.java:91)
com.siebel.analytics.etl.infa.fileParsing.Infa7WfXmlDomParser.parseAll(Infa7WfXmlDomParser.java:79)
com.siebel.analytics.etl.infa.fileParsing.Infa7WfXmlDomParser.parseAll(Infa7WfXmlDomParser.java:75)
com.siebel.analytics.etl.infa.fileParsing.TaskSync.sync(TaskSync.java:158)
com.siebel.analytics.etl.client.action.TaskSynchronizationAction.doOperation(TaskSynchronizationAction.java:145)
com.siebel.etl.gui.view.dialogs.WaitDialog.doOperation(WaitDialog.java:54)
com.siebel.etl.gui.view.dialogs.WaitDialog$WaitDialogRunnableHelper.run(WaitDialog.java:86)

java.lang.Thread.run(Unknown Source)

I did some investigation to resolve the issue, verifying the "Task Name", "Command for Full Load", "Command for Incremental Load", and so on.

Finally I got some clues and resolved the issue.

Solution: Remove the escape characters from the object descriptions (whether on the folder, mapping, session, or other objects).


If you look at the exported workflow XML, the description field value does not parse correctly because of the extra double quote (").


 So just make sure the description fields do not contain any special or escape characters.

CHEERS :)


Friday, July 18, 2014

DAC 11g Evaluate Parameters Feature

Issue: Before triggering an execution plan, we often worry about what parameters DAC is going to pass to Informatica (especially in a BI Apps environment).

Solution: Use "DAC11g Evaluate parameters" feature available at Execution plan tab.

Here is a little more explanation on how to make use of it.

Step 1: Log in to DAC, click the "Execute" button, select the "Execution Plan" tab, and then select your execution plan.

Execution plan selection

Step 2: Go to "Ordered Tasks" child tab and right click on any tasks. Select "Evaluate Parameters".

Right click on the tasks

Step 3: It will ask you whether to proceed. Click "Yes".

Evaluate Parameters input window


Step 4: It will take some time to fetch the parameter names and their corresponding values.

Here is the output with the parameter names and their values.

Note: 

a) You will get to know whether the task will go for an incremental or a full load.
b) The Initial Extract Date and Last Extract Dates.
c) The database connection values.

 All these details are enough for us to answer such questions confidently.
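 A related tip: the full-versus-incremental decision is driven by the refresh dates DAC keeps per table, which live in W_ETL_REFRESH_DT (see the metadata-tables posts). You can inspect them directly (a sketch; column names vary by version, hence the plain select *):

 select * from w_etl_refresh_dt;
 -- An empty refresh date for a task's tables is what makes DAC choose the full-load command.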

CHEERS :)

Disadvantage of DAC 11g patch feature and the resolution

Hi All,

 In my previous post, DAC 11g code migration, we discussed how to perform patch migration (subset code migration).

 But some time later I faced an issue with the patch migration:

traditional (full container) migration started taking a much longer time.

To get rid of this issue, I followed this method:

I took a backup of the tables involved in DAC patching and then truncated them.

CREATE TABLE W_ETL_PATCH_BKP AS SELECT * FROM W_ETL_PATCH;
CREATE TABLE W_ETL_PATCH_DATA_BKP AS SELECT * FROM W_ETL_PATCH_DATA;
CREATE TABLE W_ETL_PATCH_CHILD_BKP AS SELECT * FROM W_ETL_PATCH_CHILD;
CREATE TABLE W_ETL_PATCH_AUDIT_BKP AS SELECT * FROM W_ETL_PATCH_AUDIT;
CREATE TABLE W_ETL_APATCH_BKP AS SELECT * FROM W_ETL_APATCH;

TRUNCATE TABLE W_ETL_PATCH;
TRUNCATE TABLE W_ETL_PATCH_DATA;
TRUNCATE TABLE W_ETL_PATCH_CHILD;
TRUNCATE TABLE W_ETL_PATCH_AUDIT;
TRUNCATE TABLE W_ETL_APATCH;
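If you ever need the old patch metadata back, the backup tables make that straightforward (a sketch; it is simply the reverse of the backup step above):

INSERT INTO W_ETL_PATCH SELECT * FROM W_ETL_PATCH_BKP;
-- ...repeat for the other four tables, then COMMIT.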

Note: Adopt this approach at your own discretion.


Saturday, July 12, 2014

Important DAC system properties

 In this post, we will discuss a few important DAC system properties.


1) Allow Clients To Remember User Password (value: TRUE, default: TRUE)

Description: When set to true, this property allows clients to remember the user's password while logging in to the repository. Such passwords, though encrypted, are stored in the client machine's file system. While accessing critical environments, such as production, the clients that connect to the repository may not be trustworthy; under such circumstances, the Administrator can set this flag to false. When set to false, even if the user clicks "remember password" on the login screen, the client will warn that the password cannot be remembered for this repository, and the user will be required to re-enter the password every time he/she logs in.

My note: The DEV DAC server on one of my projects was not allowing me to save the password. I spent some time on it, and this property turned out to be the cause.

2) Concurrency Level (value: 1, default: 1)

Description: The maximum number of ETLs that can run in parallel. With a concurrency of 1, the ETL runs in the same process space as the DAC Server; when multiple ETLs are allowed, they run as separate OS processes. An invalid value is interpreted as 1. Restart the DAC Server for the change to take effect.

My note: By default only one execution plan runs at a time in DAC. When you increase this value, you can run multiple execution plans at the same time (provided they are independent of each other).

3) Dryrun (value: FALSE, default: FALSE)

Description: When set to true, all tasks are executed without invoking the Informatica workflows. The change capture, truncation of tables, drop/create index and analyze statements are executed for real. This should be treated as a debugging option and should never be used in production mode. The possible values are TRUE or FALSE. The change takes effect for the next ETL; restarting the DAC Server is not required.

My note: This option looks similar to the "No Run" option, but be careful while using it: it will not load any data, yet it will still perform the rest of the subtasks (truncate, drop/create index, analyze).

4) No Run (value: FALSE, default: FALSE)

Description: When set to true, this property runs the execution plan just to generate the tasks and their details. None of the tasks is executed, and the refresh dates are not updated. Use this only to debug. The change takes effect for the next ETL; restarting the DAC Server is not required.

My note: This is a really helpful option. I have been asked a couple of times whether an execution plan would run as a full or an incremental load, and this option let me show the proof.

5) Scheduler.Poll.Interval (value: 300, default: 300)

Description: The value in seconds for how often the DAC Server polls for changes in the schedule configuration. Restart the DAC Server for the change to take effect.

My note: I once scheduled an execution plan to run just two minutes later, and it did not start at the expected time. I finally found that the poll interval was set to 5 minutes, which means you have to allow at least that much time for DAC to pick up a new schedule.
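 All of these properties live in the W_ETL_SYSPROP repository table (see the metadata-tables posts), so you can review them outside the client as well (a sketch; column names vary by DAC version, hence the plain select *):

 select * from w_etl_sysprop;
 -- Change the values through the DAC client rather than via SQL, so the server picks them up cleanly.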

Friday, April 4, 2014

DAC 11g code migration ( DAC-Oracle Data Warehouse Administration Console)

Issue: How to easily migrate DAC 11g metadata from one environment to another (e.g. DEV -> UAT).

Solution: The DAC 11g patching feature enables you to migrate subsets of DAC metadata such as tasks, subject areas, etc.

-----------------------------------------------------------------------------

 Below are the high level steps in DAC 11g patching concept.

       1. Create patch

       2. Add contents to patch

       3. Close patch

       4. Export Patch

       5. Apply Patch

-----------------------------------------------------------------------------

 Here are the detailed steps for using the patching feature.

 In this post we will migrate a single subject area from one environment to another. In DAC 10g we used to migrate the entire container for any code migration.

 I will use environment names like DEV and SIT. :P

Step 1: Log in to the DEV DAC and select the subject area you want to migrate. In my case, the subject area name is "Test_SA".


Login to DAC 11g and select subject area

Step 2: Right-click the subject area and click "Add Object(s) to Patch".

Right click on the subject area

Step 3: Select "Create a new patch" option and give patch name to it. 
             ex:- PATCH_Test_SA

In the "Add to Patch Set" section select "Selected parent object(s) and all child object(s).

Basically this will add the objects below to the patch:

     a) Subject area
     b) Tasks under subject area
     c) Source and Target tables for the tasks
     d) Table columns
     e) Indexes(if any)

Creating Patch in DAC11g

Step 4: Once the patch creation completes, a summary window is displayed. Click the "OK" button.

DAC11g patch creation summary

Step 5: Navigate to the "Setup" tab, then to the "Working Patches" tab, to see the recently created patch. The patch will be in "Open" status, which means you can still add more DAC objects to it.

DAC 11g patch in open status

Step 6: Right-click the patch, select "Patches", then click "Close Patch".

Close patch in DAC11g

Step 7: The patch status now changes to "Closed".

Patch status becomes Closed

Step 8: To export the patch, right-click the patch, select "Patches", and then click "Export Patch".

DAC 11g patch export

Step 9: Save the exported patch.

Save patch

Step 10: The export summary window is displayed. Move this patch XML to the other instance.

Export Patch log summary

Step 11: Login to another DAC 11g instance.

Login to UAT DAC 11g

Step 12: Go to Tools -> DAC Repository Management -> Apply Patch

Apply patch

Step 13: Select the previously moved patch XML to apply the patch.

Select the patch

Step 14: The DW tables script generation wizard is displayed. Click "Cancel" to skip the warning message, or click "OK" to continue if you want DAC to create the data warehouse tables.

Generate DW tables script

Step 15: Finally, the patch log window is displayed. Click the "OK" button and verify whether your objects migrated properly.

Applied patch log summary


 This feature really helps with migrating subsets of code in DAC 11g.

CHEERS :)
