
Hello, dear Habr readers! Naturally, every employer wants to evaluate the effectiveness and quality of the work their employees perform, to increase profits and cut costs. IT infrastructure support is always a "black box": what is the money being paid for, if nothing has broken yet? Since hardly any manager will want to dig into the details, sooner or later a report on the work done will be requested, preferably a daily one. Looking at pretty numbers and neat charts, the customer invariably acquires a taste for them. Gradually there appear reports on infrastructure monitoring, backup status, missed incidents, and the operation of DLP systems. And the further it goes, the worse it gets: IT support starts to lose efficiency, and the team's work schedule changes, because the report is needed by morning. This practice wonderfully stimulates mental activity in the direction of quickly delivering data to which you have almost exclusive access. Below I will try to describe my way of solving this problem.
One report chases another
The customer is a very large manufacturing company with a huge number of stores and warehouses. It favors Oracle in a Windows environment (a rare combination). The factories are out of scope; our targets are the warehouses and stores, and all the DBMS instances running there.
It is known that new DBMS instances are created regularly by developers and testers. They can easily tell no one about the existence of their test environment, yet they will panic when it is accidentally deleted during server decommissioning or routine virtualization. There are god-forsaken servers with databases ten years old that still serve some store or warehouse. Where the databases are located (even geographically), nobody knows, including the customer and the $10K monitoring system: these DBMS instances were never added to it. As already stated, the specifics are such that most of the Oracle DBMS are deployed in a Windows environment; there are already about 200 Windows instances, and collecting information about them is very difficult. There is also Oracle on Linux, but only about 40 such databases. There is one serious advantage: the servers follow a location-based naming convention, so find the name of a server and you find its location.
For developing the report we will use PowerShell. Why? Because:
- The work is done from a Windows Server 2008 terminal machine; there is no access to the other servers from outside.
- Excel is there! PowerShell works wonderfully with it as a COM object. There is no need to hunt for modules, as in Python, for example, since everything is already built into .NET.
- Most of our servers still run Windows.
- I have more experience with PowerShell.
To access and collect information from the Linux hosts, we additionally install Cygwin: all the scripts and reports will then live in one place, and that is good. The occasion for the report is a long-running security audit of the Oracle databases in connection with migrating the DBMS to a virtual environment.
It is necessary to determine:
- how many databases we have,
- what their current state is,
- which servers they are located on, and whether they are running at all,
- how they consume resources, and which Oracle version is deployed.
Starting with Linux
For some reason the local developers are afraid of Linux hosts, so all of those DBMS instances are in production. The servers are known, and there are few of them. We walk through the list of Linux hosts and append the results to a file in our directory. To find Oracle instances, we look for a running pmon background process with a simple one-liner bash script.
Script 1:
for line in $(cat file.txt); do
  # Each Oracle instance runs an ora_pmon_<SID> background process;
  # emit "host;SID;;;" lines matching the CSV layout of the Windows scan.
  ssh oracle@"$line" '
    ps -e -o cmd | grep ora_pmon | grep -v grep | cut -c 10-19 > /tmp/result.txt
    while read i; do
      my_var=$(echo $i)
      echo "$(hostname -s);${my_var##*_};;;"
    done < /tmp/result.txt
    rm /tmp/result.txt
  ' >> script_files/FileOra2.csv
done
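The `${my_var##*_}` expansion at the end of the loop is what actually extracts the SID. As a standalone sketch (the helper name is mine, not from the article), the same idea can replace the fixed-position `cut -c 10-19`, which silently truncates SIDs longer than ten characters:

```shell
#!/bin/sh
# Hypothetical helper: extract the SID from an ora_pmon process command name
# with POSIX parameter expansion (strip everything up to the last underscore)
# instead of relying on fixed character positions.
sid_from_pmon() {
    printf '%s\n' "${1##*_}"
}

sid_from_pmon ora_pmon_ORCL          # -> ORCL
sid_from_pmon ora_pmon_WAREHOUSE01   # -> WAREHOUSE01 (too long for cut -c 10-19)
```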
Windows is our everything
Here we will not find pmon: on Windows, the whole Oracle instance is implemented as a single multi-threaded process. We will walk the Windows hosts with the help of Windows Management Instrumentation (WMI); an Oracle instance shows up among the Windows services. Using PowerShell:
Script 2
$MLpath = 'c:\scripts\DBA\script_files\ML.txt'
$MLdir  = [System.IO.Path]::GetDirectoryName($MLpath)
$outfile = $MLdir + '\' + 'FileOra.csv'
$Dbfile  = $MLdir + '\' + 'DBList.csv'
$hosts = Get-Content $MLpath -Force
$a = foreach ($pc in $hosts) {
    Write-Host "test $pc"
    try {
        # The tail of the original listing was lost; a minimal reconstruction follows.
        # Each Oracle instance registers a Windows service named OracleService<SID>,
        # so query the services over WMI and emit "host;SID;state;path;startmode".
        Get-WmiObject -Class Win32_Service -ComputerName $pc -ErrorAction Stop |
            Where-Object { $_.Name -like 'OracleService*' } |
            ForEach-Object {
                '{0};{1};{2};{3};{4}' -f $pc, ($_.Name -replace '^OracleService', ''), $_.State, $_.PathName, $_.StartMode
            }
    }
    catch { Write-Output ("$pc" + ';' + $_.Exception.Message) }
}
$a | Out-File $outfile -Force
What's next
Having collected the list of hosts and databases, we have made the first step toward keeping the information current. First of all, I created a single dedicated user in each DBMS and performed all further actions under it. It is time to gather information. We could use SQL*Plus, but since we are already working with COM objects, it is better to use OLE DB for Oracle. To do this, we install the OLE DB provider on our terminal server and execute the query of interest against each DBMS. It can be downloaded, for example, from the official Oracle website. The system requirements for OLE DB say roughly the following:
- Access to an Oracle Database (Oracle 9.2 or later)
- Oracle Client release 11.1 or later and Oracle Net Services (included with Oracle Provider for OLE DB installation).
Now we can abstract away from the operating system running on the servers. We create a connection, execute the query against each individual database, and save the results to a file. I also use script 3 on its own to run arbitrary queries against the DBMS list: free space, SGA and PGA parameters, user lists and password strength (the hashing scheme for Oracle passwords is easily found on the Internet). Some query characters would require escaping in PowerShell; in such cases it is convenient to use the Oracle function CHR, which returns the character for a given ASCII code. We also get a separate list of hosts we could not connect to, with error codes, for further analysis.
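To illustrate the CHR trick, here is a minimal sketch in bash (the same idea applies to queries embedded in PowerShell strings). The helper name `to_chr_literal` is mine, not from the article: it rewrites every apostrophe inside a SQL string value as `||CHR(39)||` (ASCII 39 is the single quote), so the final query text contains no quote characters the host language would need to escape.

```shell
#!/bin/sh
# Hypothetical helper: wrap a value as an Oracle SQL string literal, splicing
# embedded apostrophes in via CHR(39) so the host language never has to
# escape a quote character itself.
to_chr_literal() {
    printf "'%s'" "$(printf '%s' "$1" | sed "s/'/'||CHR(39)||'/g")"
}

# A value containing an apostrophe:
to_chr_literal "O'Brien"
# -> 'O'||CHR(39)||'Brien'   (Oracle evaluates this back to O'Brien)
```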
Script 3
function Get-OLEDBData ($connectstring, $sql) {
    $OLEDBConn = New-Object System.Data.OleDb.OleDbConnection($connectstring)
    $OLEDBConn.Open()
    $readcmd = New-Object System.Data.OleDb.OleDbCommand($sql, $OLEDBConn)
    $readcmd.CommandTimeout = '10'
    $da = New-Object System.Data.OleDb.OleDbDataAdapter($readcmd)
    $dt = New-Object System.Data.DataTable
    [void]$da.Fill($dt)
    $OLEDBConn.Close()
    return $dt
}

$date = (Get-Date).ToShortDateString().Replace("/", ".")
$log  = "$date" + "_" + 'error'
$db   = "$date" + "_" + 'DBlist'
$qry  = 'select INSTANCE_NAME,HOST_NAME,VERSION from V$INSTANCE'

gc c:\_tir\fileORA.csv | % {
    $row = $_.Split(";")
    $hostname = $row[0]
    $service  = $row[1]
    $connString = "password=xxxxXXXxxx;User ID=ORAUSER;Data Source=$hostname/$service;Provider=OraOLEDB.Oracle"
    try { Get-OLEDBData $connString $qry }
    catch { Write-Output ("$hostname" + ';' + $_.Exception.Message) >> C:\_tir\$log.log }
} | Export-Csv c:\_tir\$db.csv -Delimiter ';'
Adding some beauty
Text files are ugly, so we combine all the results in the directory into a daily Excel report, working with the Excel sheet as a regular COM object. We keep the Excel window hidden so the operation runs faster, and the report is sent by email. Finally, we update our TNSNAMES file for convenient later connections to the databases via SQL*Plus, using the correct file syntax (I never could remember it).
Script 4
$date = (Get-Date).ToShortDateString().Replace("/", ".")
$MLpath = 'c:\scripts\DBA\script_files\ML.txt'
$MLdir  = [System.IO.Path]::GetDirectoryName($MLpath)
$outfile = $MLdir + '\' + 'FileOra.csv'
$Dbfile  = $MLdir + '\' + 'DBList.csv'
$Dbfilexls = $MLdir + '\' + 'DBList' + $date + '.xlsx'
gc $outfile | Sort-Object -Unique | Out-File $Dbfile -Force

<# create the Excel document #>
$excel = New-Object -ComObject excel.application
$excel.Visible = $false
$workbook = $excel.Workbooks.Add()
$workbook.WorkSheets.Item(3).Delete()
$workbook.WorkSheets.Item(2).Delete()
$workbook.WorkSheets.Item(1).Name = "Databases"
$sheet = $workbook.WorkSheets.Item("Databases")
$x = 2
$colorIndex   = "microsoft.office.interop.excel.xlColorIndex" -as [type]
$borderWeight = "microsoft.office.interop.excel.xlBorderWeight" -as [type]
For ($b = 1; $b -le 5; $b++) {
    $sheet.Cells.Item(1, $b).Font.Bold = $true
    $sheet.Cells.Item(1, $b).Borders.ColorIndex = $colorIndex::xlColorIndexAutomatic
    $sheet.Cells.Item(1, $b).Borders.Weight = $borderWeight::xlMedium
}
$sheet.Cells.Item(1, 1) = "Hostname"
$sheet.Cells.Item(1, 2) = "Instance"
$sheet.Cells.Item(1, 3) = "state"
$sheet.Cells.Item(1, 4) = "path"
$sheet.Cells.Item(1, 5) = "autorun"
Foreach ($row in $data = Import-Csv $Dbfile -Delimiter ';' -Header name, value, path, state, start) {
    $sheet.Cells.Item($x, 1) = $row.name
    $sheet.Cells.Item($x, 2) = $row.value
    $sheet.Cells.Item($x, 3) = $row.path
    $sheet.Cells.Item($x, 4) = $row.state
    $sheet.Cells.Item($x, 5) = $row.start
    $x++
}
$range = $sheet.UsedRange
$range.EntireColumn.AutoFit() | Out-Null
$excel.ActiveWorkbook.SaveAs($Dbfilexls)
if ($workbook -ne $null) {
    $sheet = $null
    $range = $null
    $workbook.Close($false)
}
if ($excel -ne $null) {
    $excel.Quit()
    $excel = $null
    [GC]::Collect()
    [GC]::WaitForPendingFinalizers()
}

<# update the tnsnames.ora file #>
IF (Test-Path $MLdir\tnsnames.ora) {
    Remove-Item $MLdir\tnsnames.ora -Force
} ELSE {
    Write-Host "new tnsora"
}
gc $Dbfile | % {
    $row = $_.Split(";")
    $hostname = $row[0]
    $service  = $row[1]
    $name = $service + '_' + $hostname
    "$name = (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = $hostname)(PORT = 1521)) ) (CONNECT_DATA = (SERVICE_NAME = $service) ) )" >> $MLdir\tnsnames.ora
}

<# mail the report #>
$filename = $Dbfilexls
$smtpServer = "server_name"
$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
$credentials = New-Object System.Net.NetworkCredential("server_name\mail", "xxxxXXXxxx")
$smtp.Credentials = $credentials.GetCredential($smtpServer, "25", "basic")
$msg = New-Object Net.Mail.MailMessage
$att = New-Object Net.Mail.Attachment($filename)
$msg.From = "user@yourdomain.com"
$msg.To.Add("timur@rrrr.xxx, valentin@rrrr.xxx")
$msg.Subject = "Database_Report"
$msg.Body = "Database report sample body"
$msg.IsBodyHtml = $false
$msg.Attachments.Add($att)
$smtp.Send($msg)
It remains to put our scripts into the Windows Task Scheduler. First we collect information about the Oracle instances on the Linux and Windows hosts (scripts 1 and 2, respectively). Next we connect to each DBMS and collect details (script 3). Then we build the aggregated Excel report and send it by mail (script 4).
Conclusions: we got what we fought for
- We put things in order. There were ancient Oracle 9 databases and quietly installed Oracle 12 instances; we finally got rid of them.
- I set up automatic forwarding of the report email from my Exchange mailbox to the interested parties, so everyone is always aware of infrastructure changes.
- By comparing two reports, I know when and where a database went down.
- I gradually reused parts of this solution in other applications, and I always have a current tnsnames list.
- While building the report script, holes were discovered: accounts with standard Oracle passwords, like scott/tiger, some with elevated access privileges. So after the report was finished, we had to carry out a separate security audit enumerating the default passwords and produce a report on that list as well. Thus the solution came in handy twice.
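The default-password sweep from the last point can be sketched as a small bash loop. The account list and helper name are illustrative, not the article's actual audit script; each generated sqlplus probe would be executed in the real audit, and a successful login (exit status 0) reported as a hole.

```shell
#!/bin/sh
# Illustrative sketch: emit one sqlplus probe command per well-known default
# Oracle credential pair (-S silent, -L attempt the logon only once). Here we
# only print the commands; the real audit would run each and check its status.
check_defaults() {
    svc=$1
    while IFS=/ read -r user pass; do
        printf 'sqlplus -S -L %s/%s@%s\n' "$user" "$pass" "$svc"
    done <<'EOF'
scott/tiger
system/manager
sys/change_on_install
EOF
}

check_defaults ORCL   # prints three probe commands, one per credential pair
```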
Successful automation!