Step by Step Golden Gate
Extract
The Extract process runs on the source system and is the data capture mechanism of
GoldenGate. It can be configured both for the initial load of the source data and for
synchronizing changed data on the source with the target. It can also be configured to
propagate DDL changes on those databases where DDL change support is available.
Replicat
The Replicat process runs on the target system, reads transactional data changes as
well as DDL changes, and replicates them to the target database. Like the Extract
process, the Replicat process can be configured for Initial Load as well as Change
Synchronization.
Collector
The Collector is a background process which runs on the target system and is started
automatically by the Manager (Dynamic Collector), or it can be configured to start
manually (Static Collector). It receives extracted data changes sent over TCP/IP and
writes them to the trail files, from where they are processed by the Replicat process.
Trails
Trails are series of files that GoldenGate temporarily stores on disk; these files are
written to and read from by the Extract and Replicat processes as the case may be.
Depending on the configuration chosen, trail files can exist on the source as well as
on the target system. A trail on the local system is known as an Extract Trail, and
one on the target system as a Remote Trail.
Data Pumps
Data Pumps are a secondary extract mechanism in the source configuration. This is an
optional component: if a Data Pump is not used, Extract sends data over TCP/IP
directly to the remote trail on the target. When a Data Pump is configured, the
primary Extract process writes to the local trail, the Data Pump reads this trail,
and the data is sent over the network to remote trails on the target system.
In the absence of a Data Pump, the data that the Extract process captures resides in
memory alone and is not stored anywhere on the source system, so a network or target
failure can cause the primary Extract process to abort or abend. A Data Pump is also
useful when doing complex filtering and transformation of data, or when consolidating
data from many sources to a central target.
Data source
When processing transactional data changes, the Extract process can obtain data
directly from the database transaction logs (Oracle, DB2, SQL Server, MySQL, etc.) or
from a GoldenGate Vendor Access Module (VAM), where the database vendor (for example
Teradata) provides the components that Extract uses to capture the data changes.
Groups
Several Extract and Replicat groups can potentially co-exist on a system, and
processing groups are defined to differentiate between them. For instance, to
replicate different sets of data in parallel, we can create two Replicat groups.
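As a minimal sketch of that idea (the group names GGRC01A/GGRC01B are illustrative and not part of this setup; the trail path is the one used later in this document), two parallel Replicat groups could be registered in GGSCI as:

```
GGSCI> ADD REPLICAT GGRC01A, EXTTRAIL /u01/soft/dirdat/g1
GGSCI> ADD REPLICAT GGRC01B, EXTTRAIL /u01/soft/dirdat/g1
```

Each group then gets its own parameter file and can be mapped to its own subset of tables.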
Steps :-
SOURCE Database
Unzip and untar the GoldenGate software; the unzip output shows:
inflating: fbo_ggs_Linux_x86_ora11g_32bit.tar
Configure Schema
create tablespace
create user
Give Grants
grant connect, resource to ggs_owner;
grant select any dictionary, select any table to ggs_owner;
grant create table to ggs_owner;
grant flashback any table to ggs_owner;
grant execute on dbms_flashback to ggs_owner;
grant execute on utl_file to ggs_owner;
grant create any table to ggs_owner;
grant insert any table to ggs_owner;
grant update any table to ggs_owner;
grant delete any table to ggs_owner;
grant drop any table to ggs_owner;
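The "create tablespace" and "create user" steps above could look like the following sketch; the datafile path, sizes, and the ggs_owner password are assumptions to be adapted to your environment:

```sql
-- Illustrative values only: adjust path, sizes, and password
create tablespace ggs_data
  datafile '/u01/oradata/ggs_data01.dbf' size 200m autoextend on;

create user ggs_owner identified by ggs_owner
  default tablespace ggs_data
  temporary tablespace temp;
```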
Change Parameter as per requirement
UNDO_MANAGEMENT=AUTO
UNDO_RETENTION=86400
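These can be set as follows. UNDO_MANAGEMENT is a static parameter, so it is set in the spfile and takes effect after an instance restart. The supplemental-logging statement is an assumed extra step here, but minimal supplemental logging is required before Extract can capture changes:

```sql
SQL> alter system set undo_management=auto scope=spfile;
SQL> alter system set undo_retention=86400 scope=both;
-- Required for change capture by Extract (assumed step, not in the original list)
SQL> alter database add supplemental log data;
```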
Script :
(SQL*Plus output only; the commands themselves were lost in the original. The output
shows the instance being started, system parameters altered, the database bounced,
the tablespace and user created, and a final clean startup:)
SQL> startup
ORACLE instance started.
Database mounted.
Database opened.
INITIAL DATALOAD :-
EXPORT: @ SOURCE
$ expdp directory=db_dir dumpfile=schema_gg.dmp logfile=schema_gg.log schemas=ggtest
scp from SOURCE to TARGET
$ scp -p schema_gg.dmp 172.168.10.108:/oradata
IMPORT: @ TARGET
$ impdp directory=db_dir dumpfile=schema_gg.dmp logfile=schema_imp_gg.log schemas=ggtest
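If the source database stays open during the export, the dump is typically taken as of a known SCN so that change synchronization can later be started from that exact point. The flashback_scn parameter below is an addition not shown in the original steps, and the SCN value is a placeholder:

```
$ expdp directory=db_dir dumpfile=schema_gg.dmp logfile=schema_gg.log \
        schemas=ggtest flashback_scn=<current_scn>
```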
Execute the following scripts at the Source as the SYS user; they create the objects
required by GoldenGate, including all objects needed to support DDL replication.
SQL> @marker_setup.sql
Marker setup script
You will be prompted for the name of a schema for the Oracle GoldenGate database
objects.
Please enter the name of a schema for the GoldenGate database objects:
MARKER TABLE
OK
MARKER SEQUENCE
OK
Script complete.
SQL> @ddl_setup.sql
Oracle GoldenGate DDL Replication setup script
You will be prompted for the name of a schema for the Oracle GoldenGate database
objects.
NOTE: For an Oracle 10g source, the system recycle bin must be disabled. For Oracle 11g
and later, it can be enabled.
NOTE: The schema must be created prior to running this script.
Check complete.
Please enter the name of a schema for the GoldenGate database objects:
CLEAR_TRACE STATUS:     No errors
CREATE_TRACE STATUS:    No errors
TRACE_PUT_LINE STATUS:  No errors
INITIAL_SETUP STATUS:   No errors
(several further compile-status blocks, all reporting No errors, elided)
DDL IGNORE LOG TABLE    OK
DDL SEQUENCE            OK
GGS_TEMP_COLS           OK
GGS_TEMP_UK             OK
ENABLED
STAYMETADATA IN TRIGGER OFF
/u01/app/oracle/diag/rdbms/src/SRC/trace/ggs_ddl_trace.log
Analyzing installation status
Script complete.
SQL> @role_setup.sql
GGS Role setup script
To use a different role name, quit this script and then edit the params.sql script to
change the gg_role parameter to the preferred name. (Do not run the script.)
You will be prompted for the name of a schema for the GoldenGate database objects.
Grant this role to each user assigned to the Extract, GGSCI, and Manager processes, by
using the following SQL command:
SQL>
Grant succeeded.
SQL> @ddl_enable
Trigger altered.
Configure Source :-
1. Add Trandata
i. Generate the ADD TRANDATA statements and execute them at the GGSCI prompt
SQL> select 'add trandata '||owner||'.'||object_name||';' from dba_objects where
owner='GGTEST' and object_type='TABLE';
'ADDTRANDATA'||OWNER||'.'||OBJECT_NAME||';'
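The generated statements are then run at the GGSCI prompt after a database login. The credentials below are the ggs_owner user created earlier, and the table name is illustrative:

```
GGSCI> DBLOGIN USERID ggs_owner, PASSWORD ggs_owner
GGSCI> ADD TRANDATA GGTEST.EMP
```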
PORT 7809
,RETRIES 20 ,WAITMINUTES 2
-- Delete GoldenGate trails older than 3 days when no GoldenGate process holds a
-- checkpoint on them
-- Report lag every 60 minutes; whenever lag exceeds 20 minutes, immediately generate
-- a critical message
LAGREPORTMINUTES 60
LAGCRITICALMINUTES 20
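Assembled from the fragments above, the manager parameter file (mgr.prm) would read roughly as follows. The AUTORESTART and PURGEOLDEXTRACTS lines are reconstructions, since only their tails and comments survive in the original:

```
PORT 7809
AUTORESTART EXTRACT *, RETRIES 20, WAITMINUTES 2
-- Delete trails older than 3 days that no process holds a checkpoint on
PURGEOLDEXTRACTS /u01/soft/dirdat/*, USECHECKPOINTS, MINKEEPDAYS 3
LAGREPORTMINUTES 60
LAGCRITICALMINUTES 20
```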
TARGET DATABASE
DELETE GGEC01G1
, MEGABYTES 5
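The DELETE GGEC01G1 and MEGABYTES 5 fragments above suggest an obey file (or direct GGSCI commands) along these lines; the DBLOGIN credentials and the TRANLOG / BEGIN NOW options are assumptions:

```
DBLOGIN USERID ggs_owner, PASSWORD ggs_owner
DELETE EXTRACT GGEC01G1
ADD EXTRACT GGEC01G1, TRANLOG, BEGIN NOW
ADD EXTTRAIL /u01/soft/dirdat/g1, EXTRACT GGEC01G1, MEGABYTES 5
```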
ii. Create parameter file by edit params command and put below code
in the parameter file.
1. Edit params GGEC01G1
Created By :- Sachin Ichake
CHECKPARAMS
EXTRACT GGEC01G1
Source Database
FORMATSQL
EXTTRAIL /u01/soft/dirdat/g1
To check the parameter syntax, uncomment the two lines below and comment out the
WILDCARDRESOLVE DYNAMIC parameter, then start the group.
CHECKPARAMS
NODYNAMICRESOLUTION
NOCOMPRESSDELETES
NOCOMPRESSUPDATES
IGNOREDELETES
IGNOREUPDATES
IGNOREINSERTS
CHECKPARAMS
NODYNAMICRESOLUTION
Runtime parameters
STATOPTIONS RESETREPORTSTATS
REPORT AT 00:01
REPORTROLLOVER AT 00:01
WILDCARDRESOLVE DYNAMIC
TABLE GGS_OWNER.EMP;
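Put together, the extract parameter file for GGEC01G1 might look like the sketch below. The USERID line is an assumption, and CHECKPARAMS / NODYNAMICRESOLUTION stay commented out for normal running:

```
EXTRACT GGEC01G1
-- Created By :- Sachin Ichake
USERID ggs_owner, PASSWORD ggs_owner
EXTTRAIL /u01/soft/dirdat/g1
-- Syntax check only: uncomment the next two lines, comment out WILDCARDRESOLVE DYNAMIC
-- CHECKPARAMS
-- NODYNAMICRESOLUTION
STATOPTIONS RESETREPORTSTATS
REPORT AT 00:01
REPORTROLLOVER AT 00:01
WILDCARDRESOLVE DYNAMIC
TABLE GGS_OWNER.EMP;
```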
d. Configure Pump
i. Create Obey file if required and put below code into that or directly
execute it on gg prompt.
Created By :- Sachin Ichake
TARGET DATABASE
EXTRACT PUMP
DELETE GGPC01G1
ii. Create parameter file by edit params command and put below code
in the parameter file.
1. Edit params GGPC01G1
Created By :- Sachin Ichake
CHECKPARAMS
EXTRACT GGPC01G1
PASSTHRU
Control Parameters
RMTTRAIL /u01/soft/dirdat/1g
To check the parameter syntax, uncomment the two lines below and comment out the
WILDCARDRESOLVE DYNAMIC parameter, then start the group.
CHECKPARAMS
NODYNAMICRESOLUTION
WILDCARDRESOLVE DYNAMIC
TABLE GGS_OWNER.EMP;
TABLE EMP, FILTER (@RANGE (1, 2));
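The pump parameter file for GGPC01G1 can be assembled similarly. The RMTHOST line is an assumption based on the target address used for scp earlier:

```
EXTRACT GGPC01G1
PASSTHRU
RMTHOST 172.168.10.108, MGRPORT 7809
RMTTRAIL /u01/soft/dirdat/1g
TABLE GGS_OWNER.EMP;
```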
Configure Target :-
1. Edit the GLOBALS parameter file from the GGSCI prompt
i. edit params ./GLOBALS
GGSCHEMA ggs_owner
PORT 7809
,RETRIES 20 ,WAITMINUTES 2
-- Delete GoldenGate trails older than 3 days when no GoldenGate process holds a
-- checkpoint on them
-- Report lag every 60 minutes; whenever lag exceeds 20 minutes, immediately generate
-- a critical message
LAGREPORTMINUTES 60
LAGCRITICALMINUTES 20
Add replicat
DELETE GGRC01G1
ii. Create parameter file by edit params command and put below code
in the parameter file.
Created By :- Sachin Ichake
SHOWSYNTAX
NODYNSQL
NOBINARYCHARS
REPLICAT GGRC01G1
HANDLECOLLISIONS
TARGET DATABASE
#DB_Connect()
ASSUMETARGETDEFS
SOURCEDEFS C:\GG\DIRSQL\MYTABLES.SQL
To check the parameter syntax, uncomment the two lines below and comment out the
WILDCARDRESOLVE DYNAMIC parameter.
CHECKPARAMS
NODYNAMICRESOLUTION
#generate_stats()
REPORTCOUNT EVERY 1 HOUR, RATE
STATOPTIONS RESETREPORTSTATS
REPORT AT 00:01
REPORTROLLOVER AT 00:01
WILDCARDRESOLVE DYNAMIC
TRANSACTIONTIMEOUT 5 S
-- DDL support
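A sketch of the replicat parameter file assembled from the fragments above. The USERID, DDL INCLUDE ALL, and MAP lines are assumptions, and HANDLECOLLISIONS should be removed once the initial load has been reconciled:

```
REPLICAT GGRC01G1
USERID ggs_owner, PASSWORD ggs_owner
ASSUMETARGETDEFS
HANDLECOLLISIONS
-- DDL support (assumed)
DDL INCLUDE ALL
REPORTCOUNT EVERY 1 HOUR, RATE
STATOPTIONS RESETREPORTSTATS
TRANSACTIONTIMEOUT 5 S
MAP GGTEST.*, TARGET GGTEST.*;
```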