Fame at last!

Someone, who shall remain nameless (Eric Yen), just pointed me to this example of creating a credential in the 12c manual.

Looks kind-of familiar! :)

I mentioned in my 12c scheduler article that the DBMS_CREDENTIAL package is used in pretty much the same way as the old credential procedures in the DBMS_SCHEDULER package. So much so that someone has taken the example from my 11g article (here), replaced the package name and put it in the 12c documentation.
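
For anyone comparing the two, the call really is almost identical. A minimal sketch (the credential name, username and password below are invented for illustration):

-- 11g and earlier: the credential lives in the scheduler package.
BEGIN
  DBMS_SCHEDULER.create_credential(
    credential_name => 'my_credential',
    username        => 'batch_user',
    password        => 'batch_password');
END;
/

-- 12c: same arguments, new home in DBMS_CREDENTIAL.
BEGIN
  DBMS_CREDENTIAL.create_credential(
    credential_name => 'my_credential',
    username        => 'batch_user',
    password        => 'batch_password');
END;
/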

Common Users & SYSDBA with #Oracle 12c Multitenancy

A 12c multitenant database introduces the new concepts of local users and common users. This article shows simple use cases for why DBAs may want to create common users themselves – in contrast to the common users that are created automatically, such as SYS, SYSTEM and MDSYS.

A typical requirement is to have a superuser other than SYS but with the same power, like the common user C##_SYS in the picture below.

Or suppose we have many pluggable databases (PDBs) and different superusers responsible for different PDBs like C##_ADMIN1 and C##_ADMIN2:
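
As a rough sketch of what that setup can look like (the passwords and the PDB name below are placeholders, and the grants shown are just one way of doing it):

-- A common user, visible in every container, with the same power as SYS.
CREATE USER c##_sys IDENTIFIED BY secret CONTAINER = ALL;
GRANT SYSDBA TO c##_sys CONTAINER = ALL;

-- A common user that only gets SYSDBA inside "its" PDB.
CREATE USER c##_admin1 IDENTIFIED BY secret CONTAINER = ALL;
ALTER SESSION SET CONTAINER = pdb1;
GRANT SYSDBA TO c##_admin1;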

Oracle’s SQL Tuning pack, part II

Installation of ASH Analytics with EM12c Rel4

If you have the Diagnostics/Tuning Management Packs and EM12c, you should install ASH Analytics to get the full benefit of the performance data available via AWR and ASH. ASH Analytics is the future of the Top Activity view, and I’ve written a number of posts on the value of the product.
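
Under the covers, ASH Analytics is rendering the same sampled session data you can already query yourself. Purely as an illustration (not part of the install), this is the kind of raw data it visualizes:

-- Top activity over the last 30 minutes of in-memory ASH samples
-- (requires the Diagnostics Pack license).
SELECT NVL(event, 'ON CPU') AS activity,
       COUNT(*)             AS samples
FROM   v$active_session_history
WHERE  sample_time > SYSDATE - 30/1440
GROUP  BY NVL(event, 'ON CPU')
ORDER  BY samples DESC;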

NYOUG Summer 2014 Conference: Understanding Oracle Locking Internals

Thank you to all those who attended my session, Understanding Oracle Locking Internals, today at the Summer 2014 conference of the New York Oracle User Group in Manhattan. You can download the presentation and the scripts I used in the demos here.

Presentation: http://www.proligence.com/pres/nyoug14/understanding_oracle_locking.pdf
Scripts: http://www.proligence.com/pres/nyoug14/understanding_oracle_locking_scripts.zip
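
The scripts go much deeper than this, but as a quick taste of the territory, here is the classic blocker/waiter check against V$LOCK (just an illustration, not one of the linked scripts):

-- Who is blocking whom, according to the lock views.
SELECT blocker.sid  AS blocking_sid,
       waiter.sid   AS waiting_sid,
       waiter.type  AS lock_type,
       waiter.ctime AS seconds_waited
FROM   v$lock blocker
       JOIN v$lock waiter
         ON  blocker.id1 = waiter.id1
         AND blocker.id2 = waiter.id2
WHERE  blocker.block = 1
AND    waiter.request > 0;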

As always, your feedback will be highly appreciated.

Hadoop as a Cloud

The idea of clouds “meeting” big data or big data “living in” clouds isn’t simply marketing hype. Because big data followed so closely on the trend of cloud computing, both customers and vendors still struggle to understand the differences from their enterprise-centric perspectives. Everyone assumes that Hadoop can work in conventional clouds as easily as […]

More Than Just a Lake

Data lakes, like legacy storage arrays, are passive. They only hold data. Hadoop is an active reservoir, not a passive data lake. HDFS is a computational file system that can digest, filter and analyze any data in the reservoir, not just store it. HDFS is the only economically sustainable, computational file system in existence. Some […]

Hadoop is Not Merely a SQL EDW Platform

I was recently invited to speak about big data at the Rocky Mountain Oracle Users Group. I presented to Oracle professionals who are faced with an onslaught of hype and mythology regarding Big Data in general and Hadoop in particular. Most of the audience was familiar with the difficulty of attempting to engineer even Modest Data on […]

20th Century Stacks vs. 21st Century Stacks

In the 1960s, Bank of America and IBM built one of the first credit card processing systems. Although those early mainframes processed just a fraction of the data that eBay or Amazon handle now, the engineering was complex for its day. Once credit cards became popular, processing systems had to be built to handle […]

Oracle Enterprise Manager 12c Release 4

Funny how there is no mention of the name “Cloud Control” in the latest announcement. Does that mean we are back to calling it just Enterprise Manager again?

Anyway, the latest instalment of EM has been born. The downloads are available on OTN.

I’ve got mine downloading. The past few versions have been relatively easy to install on a clean system, but the upgrades have been a total pig. I’m hoping this one is going to be significantly easier… Sucker… :)

Cheers

Tim…