What is one of the most daunting issues facing IT departments today? What kind of database issue can land people in orange jumpsuits, even though how to avoid it remains unclear? Regulatory compliance. Regulatory compliance is driving the IT departments of many Fortune 1000 and worldwide corporations past the edge of their abilities. With regulatory compliance, how much data do you have to keep, how long do you have to keep it, how fast do you have to be able to access it, and what technology do you use to maintain it?
To meet compliance demands, Fortune 1000 companies typically must collect, model, and report on data from multiple systems at multiple sites.
I had a machine running the default database created by the Oracle installer. This instance was called o1123.
I then created a second database by hand (with crdb.sh — https://github.com/khailey/swingbenchsh/blob/master/shell_scripts/crdb.sh — for swingbench (http://www.dominicgiles.com/swingbench.html) and slob (http://www.pythian.com/blog/my-slob-io-testing-index/) tests) instead of using dbca (maybe that was a mistake), and then I wanted to access OEM. The second instance was called SOE60G (a 60 GB swingbench dataset database).
First I tried to start up dbconsole:
To drive revenue and growth, companies are constantly improving existing applications or creating new ones. This ongoing application development depends upon provisioning environments for developers and QA teams. Once they are up and running in these environments, code development calls for the efficient change management and later deployment of changes. The slower and more costly provisioning and managing the development environments becomes, the more delays and bugs there will be in the applications, and the less revenue the business will generate.
Code management has become straightforward with the use of source control tools such as Git, SVN, and Perforce. Provisioning development environments has been made more efficient by Chef, Puppet, and Jenkins.
How long does your financial close take? How long would you like it to take? How much access do your internal business analysts have to the financial data?
We’ve worked with a number of companies and taken their financial close down from weeks to a couple of days using data virtualization and the Delphix appliance. With Delphix and data virtualization we’ve also thrown off the constraints of limited access to financial data, giving business analysts 24×7 access even in the days ramping up to quarter end.
Problems we see
OEM just seems to have too many brittle, esoteric configuration files and process dependencies. Ideally I just want to connect with system/password and go. Is that too simple to ask for?
Today I tried out OEM and got the generic broken page:
And my first reaction was just to give up and move on, but then I noticed the error message sounded somewhat simple:
ORA-28001: the password has expired (DBD ERROR: OCISessionBegin)
Hmm, maybe this *is* easily fixable. Well, guess again. Luckily, someone has documented the fix well.
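The core of that fix is a sketch like the following, assuming the expired account is one of the OEM monitoring users (SYSMAN or DBSNMP — check which account the error actually refers to in your setup; the password below is a placeholder):

```sql
-- Hedged sketch: reset and unlock the expired OEM account.
-- SYSMAN and the password are assumptions; substitute your own.
ALTER USER sysman IDENTIFIED BY "NewPassword1" ACCOUNT UNLOCK;

-- Optionally stop the password from expiring again
-- (only if your security policy allows it):
ALTER PROFILE default LIMIT PASSWORD_LIFE_TIME UNLIMITED;
```

Note that dbconsole stores the SYSMAN password in its own configuration, so after changing it you typically have to reconfigure and restart dbconsole as well, not just run the ALTER USER.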
How does the STA work in 11gR2 with the query from “Oracle’s SQL Tuning Pack – part II”?
In Part II, the STA in 10g proposed a new profile for the query, but that profile actually caused the query to run slower. The question is: does the STA in 11gR2 do better?
Below I ran the query load, identified the query, and submitted it to the STA. The STA spent 30 minutes burning CPU trying to tune the query and finally ended with an error that a better plan could not be found.
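For reference, submitting a statement to the STA follows the standard DBMS_SQLTUNE flow; a minimal sketch, with the sql_id and task name as placeholders:

```sql
-- Create and run a tuning task for a statement already in the cursor cache.
-- 'your_sql_id' and 'sta_demo' are placeholders.
DECLARE
  l_task VARCHAR2(64);
BEGIN
  l_task := DBMS_SQLTUNE.CREATE_TUNING_TASK(
              sql_id     => 'your_sql_id',
              time_limit => 1800,   -- seconds; matches the ~30-minute run above
              task_name  => 'sta_demo');
  DBMS_SQLTUNE.EXECUTE_TUNING_TASK(task_name => 'sta_demo');
END;
/

-- Inspect the findings (or the "could not find a better plan" error):
SELECT DBMS_SQLTUNE.REPORT_TUNING_TASK('sta_demo') FROM dual;
```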
The idea of clouds “meeting” big data or big data “living in” clouds isn’t simply marketing hype. Because big data followed so closely on the trend of cloud computing, both customers and vendors still struggle to understand the differences from their enterprise-centric perspectives. Everyone assumes that Hadoop can work in conventional clouds as easily as […]
Data lakes, like legacy storage arrays, are passive. They only hold data. Hadoop is an active reservoir, not a passive data lake. HDFS is a computational file system that can digest, filter and analyze any data in the reservoir, not just store it. HDFS is the only economically sustainable, computational file system in existence. Some […]
I was recently invited to speak about big data at the Rocky Mountain Oracle Users Group. I presented to Oracle professionals who are faced with an onslaught of hype and mythology regarding Big Data in general and Hadoop in particular. Most of the audience was familiar with the difficulty of attempting to engineer even Modest Data on […]