A Day in the Life of an Enterprise DBA
With systems-management tools such as Oracle Enterprise Manager, DBAs can manage the enterprise from home. Sit in on a day with OEM.
Thanks for coming over to the office. We'll go through a typical day of database administration. Today's to-do list covers a variety of Oracle Enterprise Manager (OEM) functionality. We won't cover everything OEM and Performance Pack can do, but you'll get a taste of how you can use OEM to take care of typical administration tasks. (For information about Oracle's new manageability packs, see "Manage Globally, Act Locally.")
Today’s To-Do List
First, let’s go over the current Oracle systems for which I’m responsible. The company has two production databases, four development databases, and a quality-assurance database. It also has a variety of Oracle Data Server versions, applications, and systems from hardware vendors. Three machines are running UNIX from different vendors, and one is running Microsoft Windows NT.
Today’s task list includes several items:
•Load new account data into the dwhs development database
•Change a view, using Schema Manager
•Write a stored function for a data mart
•Add a new user to the dasm development database
•Determine if a database link is working
•Calculate the total size of all the databases
In addition to these tasks, I’ll inevitably be interrupted with some calls during the day. Let’s get started. Sit down at the OEM-console machine, and you can follow along as I take on today’s challenges.
Load New Account Data
Yesterday, the warehouse-team manager sent an e-mail message instructing me to load a file into the data-warehouse instance. To do this, I need to create and load a temporary table to find some missing account numbers and locations. The file is now ready for loading. I need to find a place for the data table, create the table, load the data, check and adjust the extents during the load, and create the indexes.
A Place for the Table To find a tablespace with enough room, I start Storage Manager from the console or from the Start menu. Once I’m connected, I can see three items on the navigator display: tablespaces, datafiles, and rollback segments. If I place my cursor on the tablespaces item, it will display a bar graph on the right showing each tablespace, the total room allocated, and the room remaining. This tells me which tablespace has enough space to accommodate the table data.
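Storage Manager pulls these numbers from the data dictionary; if I wanted the same answer from SQL Worksheet, a query against DBA_FREE_SPACE along these lines would be a rough equivalent (the column alias is just for readability):
select tablespace_name, sum(bytes) / (1024*1024) "Free (Mb)"
from dba_free_space
group by tablespace_name;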
Once I’ve chosen the tablespace, I create the table in Schema Manager, which lets me manage all schema objects in the database. In the navigator, I move to the user who should own the table. If I can find a table with columns similar to those in the data I’m about to load, I can highlight that table, right-click, and choose Create Like on the pop-up menu that appears. This will create a new table with a structure identical to that of the one highlighted. I can then name the table and remove columns as needed. I can instead create the table from scratch, using a wizard.
When I create the table, I have to be sure to use the Storage tab in the dialog box to specify the tablespace and the extents where the table will be created. Once I’ve done this, I click on the Show SQL button to display the actual SQL that Schema Manager will execute to create the table, which should look something like this:
create table ACCOUNT_TEMP (
  account_id   varchar2(20),
  account_code varchar2(15),
  account_name varchar2(40)
)
storage (initial 1M next 1M maxextents 121 pctincrease 0)
tablespace LOAD;
I then choose OK or Apply to create the table. To confirm its creation, I can look in the Navigator window for the table owned by the user.
Prepare to Load the Data I can use Oracle’s Data Manager to load the data into the ACCOUNT_TEMP table. Data Manager can load, import, and export data via SQL*Loader, Import, and Export, respectively.
But before I can load data, I need to create a control file for Data Manager’s SQL*Loader. A control file is an input file used by SQL*Loader that details how to interpret data fields within the data file. (The syntax for this file is in the Oracle Server Utilities Guide.) Then I start up Data Manager and get ready to load the data, using this new control file.
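For illustration only, a minimal control file for this load might look something like the sketch below; the data-file name and the comma delimiter are assumptions, not something the warehouse team specified:
-- minimal SQL*Loader control file; INFILE name and delimiter are assumptions
LOAD DATA
INFILE 'account.dat'
APPEND
INTO TABLE ACCOUNT_TEMP
FIELDS TERMINATED BY ','
(account_id, account_code, account_name)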
Oracle Tablespace Manager and SQL Worksheet let me monitor table growth while the data is loading. If I open Tablespace Manager and navigate to the tablespace holding the ACCOUNT_TEMP table, it will show me a graphic representation of the extents allocated for the table as well as the number of extents and the size of each extent in blocks and bytes. This information comes from the data dictionary view DBA_SEGMENTS. I can find ACCOUNT_TEMP by sorting on the segment name or on the owner of the table. The extents will be allocated to accommodate table growth as the data loads.
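The same segment information can be pulled by hand; a query against DBA_SEGMENTS roughly like this one shows the extent count and size for the table (the owner shown is a placeholder):
select segment_name, extents, blocks, bytes
from dba_segments
where segment_name = 'ACCOUNT_TEMP'
and owner = 'LOADER';   -- placeholder owner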
Next I open SQL Worksheet as the table owner and use a simple query to count the rows in the table:
Select count(*) from ACCOUNT_TEMP;
Executing this statement will yield 0 rows prior to the data-load process. I have to be prepared to alter extents by using Schema Manager as the table load proceeds.
To recap what I’ve done so far in preparation for loading the data, I have:
•Started Storage Manager to monitor tablespace size
•Set up Data Manager to load the data
•Opened Tablespace Manager to monitor extent allocation
•Used SQL Worksheet to count table rows
•Readied Schema Manager to adjust extents if needed
Now I’m ready to load the data.
Begin Loading the Data Once all the applications are ready, I start the data load from Data Manager, which will show output indicating the number of rows allocated. From within SQL Worksheet, I execute the row count by pressing F5 on my keyboard. And from Tablespace Manager, I occasionally refresh the display of extents, to examine how the table is growing as the load proceeds.
Adjust Table Extents As the table load reaches 1 million rows, I look at Tablespace Manager to check the number of extents listed for the ACCOUNT_TEMP table segment and notice that it has already allocated 50 extents. I then check Storage Manager to see how the tablespace room is holding up. When I do a refresh, it shows that the tablespace still has plenty of room but that we need to adjust the table extents.
I decide to make the table extents bigger. First, I switch to Schema Manager and open the dialog box for the ACCOUNT_TEMP table. I originally set the next extent value to 1 megabyte, but now I change this value to 10 megabytes, select Show SQL, look at the output, and select Apply. Now, when the next extent is allocated, it will be 10 megabytes and not 1 megabyte.
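The generated statement behind that Apply is a one-line storage change and should look something like this:
alter table ACCOUNT_TEMP storage (next 10M);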
To confirm this change, I look again at Tablespace Manager: As extent 50 fills and extent 51 is allocated, it’s now 10 megabytes instead of 1 megabyte. Mission accomplished–ORA error avoided!
Create Indexes and Analyze Extents Once the data is loaded, I use Schema Manager to create indexes for the ACCOUNT_TEMP table. Finally, I analyze the table with a SQL command from SQL Worksheet so the cost-based optimizer will have statistics for the new table. This process takes several minutes.
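Written out as plain SQL, those last steps look roughly like this sketch; the index name and column are placeholders for whatever the data-mart lookups actually need:
-- index name and column list are placeholders
create index ACCOUNT_TEMP_IDX1 on ACCOUNT_TEMP (account_id)
tablespace LOAD;
-- gather statistics so the cost-based optimizer can use the new table
analyze table ACCOUNT_TEMP compute statistics;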
Change a View, Using Schema Manager
A view on the dfhr database needs modification. The development team has been creating customized views to access the human-resources tables, but they need me to add three more table columns to the view for e-mail address, work telephone number, and marital status.
In the old days, I would have kept source code on a network drive, modified the code, and replaced the view from the file. Now, I can use Schema Manager to modify the database views on the development instances.
I first start up Schema Manager and navigate to the user who owns the view. Next, Schema Manager displays the available views in the table listing on the left pane of the display. I find the HR_STAFF view to modify and select it. The right pane now contains a dialog box with data-dictionary information about the HR_STAFF view. It lists the view’s owner and status. But more important, I can view and modify the source code. As I enter the changes into the source window, the Apply button becomes available, but before applying the changes, I click on Show SQL to display the actual SQL that is about to execute.
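Show SQL simply displays a CREATE OR REPLACE VIEW statement; a sketch of the change might look like this, although the base table and column names here are assumptions, since the real ones live in the HR schema:
create or replace view HR_STAFF as
select emp_id,
       last_name,
       first_name,
       email_address,    -- new
       work_phone,       -- new
       marital_status    -- new
from hr_employees;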
Write a Stored Function
The data-mart project needs a stored function that will select a region for a given account. To create the function, I first start up SQL Worksheet. Since the users’ specifications weren’t very clear, I’ll have to experiment with some SQL to find code that will satisfy the request.
In SQL Worksheet, I type into an editor the SQL commands that will be able to fetch a region ID for a given account. As I try different options, I can retrieve previously executed commands from a command buffer stored in SQL Worksheet. I can edit the SQL in the input window without resorting to the cryptic edit commands SQL*Plus provides: Instead, I simply edit the SQL in a notepad style and execute it when I’m ready. I can execute several commands or load a script for execution from the input window. (SQL*Plus still has a vast array of functionality that SQL Worksheet lacks, but SQL Worksheet is a handy place to dabble with SQL.)
Once I have the SQL I need in order to provide the functionality requested, I can add PL/SQL function code. I add the “create or replace function” code, parameters, cursors, variables, flow control, and exception handling to produce a PL/SQL stored function in the SQL Worksheet input window.
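The finished function ends up looking roughly like the sketch below; the function, table, and column names are placeholders of my own, since the users' specifications didn't pin them down:
-- names here (function, table, columns) are placeholders
create or replace function get_region_id (p_account_id in varchar2)
return varchar2
is
  v_region_id varchar2(10);
begin
  select region_id
  into v_region_id
  from account_regions
  where account_id = p_account_id;
  return v_region_id;
exception
  when no_data_found then
    return null;   -- unmatched accounts simply return null
end;
/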
When I execute the code, the output window reports that the function created has errors. I need to see the errors and clean up the code. Because the function code is now stored on the database, I can work from Schema Manager. I find the new stored function and can see the function name, owner, status, and source code.
Clicking on the Show Errors button opens a dialog box that shows me the errors currently stored in the data-dictionary DBA_ERRORS view. I edit the code for the function and press Apply, continuing this editing and error-review process until the function compiles successfully.
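The same error text can also be pulled straight from the dictionary; a query like this one (using the placeholder owner and function name from above) lists each error with its line and position:
select line, position, text
from dba_errors
where owner = 'DMART'            -- placeholder owner
and name = 'GET_REGION_ID'       -- placeholder function name
order by sequence;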
Add a User to a Database
The assembly-line application team needs a user created on the dasm development database exactly like user A122X53 on the production database. Everything for this user must be identical–same roles, privileges, password, and so on.
Starting with the pasm production database, I use the console to navigate to users and find user A122X53. With the dasm database in view on the map window of the console, I select user A122X53 on production, hold down the right mouse button, drag the cursor to the dasm icon in the map view, and let go of the mouse button. That’s it!
Wait a minute. Maybe that was too easy. I want to make sure it worked, so I open Security Manager on the dasm database instance, find user A122X53, and look at the user’s properties. Sure enough–the password, profile, quotas, tablespace settings, privileges, and roles are intact. (Note, however, that the objects owned by A122X53 on the production database are not created in this procedure. The user will still be created with default database values even if the corresponding roles, profiles, or tablespaces don’t exist on the target database.)
Check a Database Link
An application developer reported on voice-mail late yesterday that her script for pulling production data to the development database did not work. She needs it fixed by tonight.
After I sift through the job output, it appears there is a problem with the database link. Opening up Schema Manager again, I navigate from user “public” to the database link used by this development instance to access the production database. I highlight the link and edit the properties. When I click on the Test button, the message displayed confirms that the link is inactive. Two things come to mind: either someone has changed the password for the connecting user, or someone has tampered with the server’s SQL*Net files. After investigating, I find that the user was dropped during a database refresh, so I re-create the user and test the link as before. This time, the link is active.
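Outside the GUI, the same check and fix boil down to a couple of statements; here is a sketch, with the link name, account, and password as placeholders:
-- quick manual test of the link from SQL Worksheet
select sysdate from dual@pasm_link;
-- on the remote database, re-create the account the link connects as
create user link_user identified by link_pwd;
grant create session to link_user;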
Calculate the Total Size of the Databases
The director needs a summary of the total space allocated for data and indexes on all the databases as well as the total amount actually used. To do this, I need information from all the databases. Rather than move a script to each of the machines or reconnect to each instance and run a job, I can use the OEM job scheduler.
First, I select Create Job from the console’s Job menu to create a SQL*Plus job called “Get Database Size and Capacity.” In the script window on the Parameters tab, I type the text for commands as I’d enter them at a SQL*Plus prompt. Because I need a quick overview of database files and the objects in them, I use commands such as these:
set echo off;
select to_char(sum(bytes) / (1024*1024))
  "Datafile Disk Allocated (Mb)"
from dba_data_files;
select to_char(sum(bytes) / (1024*1024))
  "Object Space Used (Mb)"
from dba_segments
where segment_type in ('TABLE', 'INDEX');
Now I can submit the created job against any of the databases from the console’s job-scheduling window. For the job to run on each database, the intelligent agent must be properly configured on each host machine. Once the jobs are submitted, I can watch the progress in the Active Jobs tab of the console’s Job pane. Jobs progress from Submitted to Scheduled to Running to Complete. Once the jobs are complete, I review the output for each database by double-clicking on the Completed line in the job-history pane and selecting the Output button. I can then copy and paste the output into a word processor or spreadsheet to format it for the director.
When a job is submitted, the console sends information via the communication daemon on the client machine to the intelligent agent running on the managed nodes. The agent registers and runs jobs by starting a tool-command-language (TCL) script on the host machine. As jobs run and complete, SQL*Plus executes the job commands on the server as the user defined in the Preferred Credentials of the console for this database. The SQL*Plus output captured by the agent is sent back to the communication daemon for display on the console. The intelligent agent runs the remote script for me!
The total time to create the job, submit it, run it on all seven databases–on four machines from four vendors–and copy the output into another program? Ten minutes.
Diagnose a Slow System
During the day, several line supervisors have called to complain that the production system is running slowly: Instead of the typical 4 seconds to enter widget information, it now takes 18 seconds. As a result, production is being delayed.
From the console, I highlight the pasm production database instance and start Performance Monitor. Everything looks fine from the overview graphical display, including hit ratios, memory sorts, and connected users. But as I watch the display update every 30 seconds, I notice that the I/O rate spikes. To investigate further, I create a graph based on input/output parameters, which shows me that the redo logs are getting bursts of activity.
To find out who is hitting the system so hard, I start up TopSessions and look at the user-resource usage on the system. I sort based on redo activity to find the culprit. A developer is inserting data into the database and causing significant redo-log activity. A double-click on the user shows me the actual SQL that has been executed. Apparently, the developer is loading data onto the production system during production time. Should I call him before I kill his session? No– kill first, call later. He should know better. Soon after, the supervisors report that the system is running well again.
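TopSessions is essentially reading the V$ views; done by hand, the hunt and the kill would look roughly like the sketch below (the sid and serial# in the KILL statement are illustrative, not the developer's real ones):
-- sessions ranked by redo generated
select s.sid, s.serial#, s.username, st.value "Redo Bytes"
from v$session s, v$sesstat st, v$statname n
where st.sid = s.sid
and st.statistic# = n.statistic#
and n.name = 'redo size'
order by st.value desc;
-- then kill the offender (sid and serial# here are illustrative)
alter system kill session '17,4711';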
Troubleshoot a Hung User Session
A developer calls complaining that his SQL statement has been hung for 20 minutes. This sounds like a locking issue, so I start Oracle Lock Manager to look at blocking locks. It shows that his session is indeed being blocked. To find out why, I select the session that holds a table-level exclusive lock. Its machine and operating-system user are the same as those of the blocked session. By reviewing the SQL of the blocked session, I see that this poor guy has locked himself out with a Select for Update cursor. He probably wrote some slow SQL, got impatient, rebooted, and then tried his SQL again and is now being blocked by his previous session. I kill both sessions from Oracle Lock Manager and then give him a call to let him know about the upcoming Oracle Channel class on PL/SQL.
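Behind Lock Manager sit the V$LOCK and V$SESSION views; a rough equivalent query for spotting the blocker and the waiter looks like this:
-- block = 1 marks the holder; request > 0 marks a session waiting on the same lock
select l.sid, s.username, s.machine,
       decode(l.block, 1, 'BLOCKER', 'WAITER') "Role"
from v$lock l, v$session s
where l.sid = s.sid
and (l.block = 1 or l.request > 0)
order by l.id1, l.id2;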
Respond to Pager Message
My alphanumeric pager goes off and shows the following:
Unable to extend object SO_LINES in tablespace OE by 1024000 on database pfhr
The console shows a red flag on the pfhr-database icon in the map pane and lists the same message in the event pane. The agent on the Sun Microsystems E5000 machine has trapped a Chunk Small event on the Oracle Financials and Oracle Human Resources instances. The SO_LINES table won’t be able to extend when necessary because the OE tablespace is running out of room.
The users have not encountered an error yet, but they will as the table grows. The registered event set has caught a potential error and sent the message to my pager long before the error occurred.
I can fix this problem in one of several ways; the simplest is to give the tablespace more room and avert the error before it happens. Using Tablespace Manager, I add a datafile to the tablespace containing the SO_LINES table.
I choose one of the datafiles in the tablespace, right-click on it, select Create Like from the pop-up menu, fill in the datafile name, and increment the file number by 1. Create Like fills in all parameters except the datafile name to match the other file. Before creating the datafile, I review the SQL to be executed, as a sanity check.
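The SQL behind that dialog box amounts to one statement; something along these lines, although the datafile path and size here are assumptions based on the existing file:
alter tablespace OE
add datafile '/u03/oradata/pfhr/oe02.dbf' size 500M;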
Once the file is created and a little time has passed, I recheck the console and find a green flag displayed on the pfhr-database icon in the map window. The Outstanding Events tab shows that the previous event has cleared, so I move the event from Outstanding to History with a comment to mention this on my weekly status report.
Quitting Time!
Not a bad day’s work.
Data load? Check.
Change view? Check.
Create stored function? Check.
Add new user? Check.
Fix database link? Check.
Send size report to director? Check.
Fix production slowdown? Check.
Help application developer? Check.
Avert production crisis? Check.
I carry my empty coffee cup to the kitchen and head for the basement to change a load of wash before I go to the roller rink to coach the kids’ roller hockey team. Working from home is a pretty good deal. Since the company set up a remote-access server in the office, I can dial in and use an OEM repository on my Microsoft Windows NT database here at home.
I shut down the PC and call it a day. A day in the life of an enterprise DBA.