Just a quick code snippet. I do a lot of data pumps to move schemas between different databases; for example, taking a copy of a schema to an internal database to try to reproduce a problem. Some of these schemas have some very large tables, and the large tables aren’t always needed to research a particular problem.
Here’s a quick bit of SQL to list the 20 largest tables by total size – including space used by indexes and LOBs. A quick search on Google didn’t reveal anything similar, so I just wrote something up myself. I’m pretty sure it’s reasonably efficient; if there’s a better way to do it, let me know! I’m posting it here so I can reference it in the future. :)
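The exact query didn’t survive here, but a minimal sketch of the approach looks like the following. It assumes access to the dba_segments, dba_tables, dba_indexes and dba_lobs dictionary views (swap in the all_/user_ views if needed) and an :owner bind variable for the schema of interest; it rolls up each table’s own segments plus its index, LOB and LOB index segments, then keeps the top 20.
select *
  from (select t.table_owner,
               t.table_name,
               round(sum(s.bytes) / 1024 / 1024) total_mb
          from dba_segments s
          join (-- the table's own segment(s)
                select owner table_owner, table_name,
                       owner seg_owner, table_name seg_name
                  from dba_tables
                 where owner = :owner
                union all
                -- index segments
                select table_owner, table_name, owner, index_name
                  from dba_indexes
                 where table_owner = :owner
                union all
                -- LOB data segments
                select owner, table_name, owner, segment_name
                  from dba_lobs
                 where owner = :owner
                union all
                -- LOB index segments
                select owner, table_name, owner, index_name
                  from dba_lobs
                 where owner = :owner) t
            on s.owner = t.seg_owner
           and s.segment_name = t.seg_name
         group by t.table_owner, t.table_name
         order by total_mb desc)
 where rownum <= 20;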
This is the fourth of twelve articles in a series called Operationally Scalable Practices. The first article gives an introduction and the second article contains a general overview. In short, this series suggests a comprehensive and cogent blueprint to best position organizations and DBAs for growth.
Tanel does offer a zip file with all of his scripts. The zip seems up to date now; I started using this alternative technique a while ago, when the zip file didn’t seem to get updated as quickly as the raw scripts directory.
mkdir tpt
cd tpt
wget -r -nH --cut-dirs=2 --no-parent --reject="index.html*" http://blog.tanelpoder.com/files/scripts/
cd ..
[svn/git] add tpt
[svn/git] commit tpt -m "added Tanel Poder's script library to our script repository"
Please remember that, as Tanel says on his own website, “always proofread the scripts and test their effect out in a test environment before running in production.”
Just filing this away on my own blog because it seems I always have a hard time finding it via Google. Here’s some code to convert raw/hex values into timestamps. (I’ve come across this need in two situations: [1] bind variables recorded in trace files or SQL Monitor, and [2] high/low values in column statistics.)
As a query, without creating any objects in the database:
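The query itself didn’t survive here, but the idea is to decode Oracle’s 7-byte internal DATE encoding (century+100, year+100, month, day, hour+1, minute+1, second+1) byte by byte. Below is a hedged sketch, assuming the raw value is supplied as a hex string literal; true TIMESTAMP values carry up to four additional bytes of fractional seconds, which this sketch ignores.
-- decode a 7-byte DATE raw value, e.g. '787B0B1A0F1F06' = 2023-11-26 14:30:05
with raw_val as (select '787B0B1A0F1F06' r from dual)
select to_date(
         to_char((to_number(substr(r,  1, 2), 'XX') - 100) * 100       -- century
               +  to_number(substr(r,  3, 2), 'XX') - 100, 'fm0000')   -- + year
      || to_char( to_number(substr(r,  5, 2), 'XX'),       'fm00')     -- month
      || to_char( to_number(substr(r,  7, 2), 'XX'),       'fm00')     -- day
      || to_char( to_number(substr(r,  9, 2), 'XX') - 1,   'fm00')     -- hour
      || to_char( to_number(substr(r, 11, 2), 'XX') - 1,   'fm00')     -- minute
      || to_char( to_number(substr(r, 13, 2), 'XX') - 1,   'fm00'),    -- second
         'YYYYMMDDHH24MISS') as converted_date
  from raw_val;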