Oakies Blog Aggregator

Sun Coast Oracle User Group June 24th Sessions Materials

Thank you to all those who stayed back quite late that night for my two presentations. I hope you found them informative and useful.

As promised, you can download the session materials here.

As always, I will be honored to hear from you.

AWR thoughts

It’s been a week since my last posting - so I thought I’d better contribute something to the community before my name gets lost in the mists of time.

I don’t have an article ready for publication, but some extracts from an AWR report appeared on the OTN database forum a few days ago, and I’ve made a few comments on what we’ve been given so far (with a warning that I might not have time to follow up on any further feedback). I tried to write my comments in a way that modelled the way I scanned (or would have scanned) through the reporting – noting things that caught my attention, listing some of the guesses and assumptions I made as I went along.  I hope it gives some indication of a pattern of thinking when dealing with a previously unseen AWR report.

 

 

Automatic Diagnostics Repository (ADR) in Oracle Database 12c

There’s a neat little change to the Automatic Diagnostics Repository (ADR) in Oracle 12c. You can now track DDL operations, and some of the messages that would formerly have gone to the alert log and trace files are now written to the debug log. Hopefully this should thin out some of the crap from the alert log. Not surprisingly, ADRCI has had a minor tweak so you can report this stuff.
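As a rough sketch of the feature (based on my reading of the 12c behaviour; check the exact syntax against your own release), DDL logging is controlled by an initialization parameter, and the resulting entries land in the DDL log under the ADR home rather than in the alert log:

```sql
-- Switch on DDL logging (it is off by default).
ALTER SYSTEM SET ENABLE_DDL_LOGGING = TRUE;

-- Subsequent DDL, for example:
CREATE TABLE t1 (id NUMBER);
-- is now recorded in the DDL log under the ADR home
-- instead of cluttering the alert log.
```

ADRCI can then be used to report on these entries, which is the minor tweak mentioned above.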

You can see what I wrote about it here:

Of course, the day-to-day usage remains the same, as discussed here:

Cheers

Tim…


Automatic Diagnostics Repository (ADR) in Oracle Database 12c was first posted on June 25, 2014 at 3:08 pm.
©2012 "The ORACLE-BASE Blog". Use of this feed is for personal non-commercial use only. If you are not reading this article in your feed reader, then the site is guilty of copyright infringement.

Number one bottleneck in IT ?

“Any improvement not made at the constraint is an illusion.” – Gene Kim paraphrasing “The Theory of Constraints”

What is the constraint in IT?

The constraints in IT are

  1. Provisioning environments for development
  2. Setting up test and QA environments
  3. Architecting development to facilitate easy changes in code
  4. Development speed
  5. Product management input

Meaning, until the first constraint is eliminated, it is pointless, and even potentially counterproductive, to tune the following constraint.

The first constraint for most organizations to tackle is thus the speed and agility with which they can provision environments for development.

The above list comes from Gene Kim, the author of The Phoenix Project. He lays out these top constraints in this Gene Kim interview.

In the interview Gene Kim talks about what causes the biggest delays in application development in IT.  He says, starting around minute 6:45

“I’ve been trained in the theory of constraints and one of the things I think is so powerful is the notion of the constraint in the value stream. What is so provocative about that notion is that any improvement not made at the constraint is an illusion. If you fix something before the constraint you end up with more work piled up in front of the constraint. If you fix something after the constraint you will always be starved for work.

In most transformations, if you look at what’s really impeding flow, the fast flow of features, from development to operations to the customer, it’s typically IT operations.

Operations can never deliver environments upon demand.

People have to wait months or quarters to get a test environment. When that happens, terrible things happen. People actually hoard environments. They invite people to their teams because they know they have a reputation for having a cluster of test environments, so people end up testing on environments that are years old, which doesn’t actually achieve the goal.

One of the most powerful things that organizations can do is to enable development and testing to get the environments they need when they need them.

After that, it’s about test setup time and test run time.

Once that is eliminated, it’s typically about architecture. How do we make changes that don’t require 15 other changes simultaneously? How do we create looser couplings?

Then after that the constraint moves into development or product management.

The obstacle is as much cultural as technical: it is simply making environments available to the people who need them, whether for production, development, or test.”


 

Integrating PFCLScan and Creating SQL Reports

We were asked by a customer whether PFCLScan can generate SQL reports instead of the normal HTML, PDF, MS Word reports so that they could potentially scan all of the databases in their estate and then insert either high level....[Read More]

Posted by Pete On 25/06/14 At 09:41 AM

Rolling upgrades using a logical standby database

A couple of weeks ago there was a Twitter discussion, started by Martin Bach (@MartinDBA), about use cases for logical standby implementation. A rolling upgrade was mentioned by Tim Gorman (@timothyjgormanas) as one of the potential recommendations for this rarely used product. I was involved in such a project in the past: I prepared instructions and performed quite a large number of rolling upgrades from 11.1 to 11.2.

Here are a couple of my “gotchas”:


  • Support for data types – make sure that all data types in your application are supported by logical standby
  • Support for Oracle features like reference partitioning or compression
  • Logging of all apply-related errors during the logical standby phase
  • Keep DML operations on big data sets to a minimum – keep in mind that "update tab1 set col1=2" will be translated into a separate update for every row in the table, and you really want to avoid that
  • Compatible parameter – if you are using flashback to roll back changes, you can change the compatible parameter together with restore points
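The restore point and compatible parameter gotcha can be sketched roughly as follows (illustrative SQL only; the restore point name is made up, and the exact sequence depends on your upgrade procedure):

```sql
-- Before the upgrade: create a guaranteed restore point so the
-- database can be flashed back if the upgrade goes wrong.
CREATE RESTORE POINT before_upgrade GUARANTEE FLASHBACK DATABASE;

-- ... perform the rolling upgrade ...

-- If it goes wrong, roll back (from MOUNT) with:
-- FLASHBACK DATABASE TO RESTORE POINT before_upgrade;

-- After a successful upgrade: drop the restore point first, then
-- raise COMPATIBLE (raising it is irreversible and needs a restart,
-- which is why the restore point must be removed beforehand).
DROP RESTORE POINT before_upgrade;
ALTER SYSTEM SET COMPATIBLE = '11.2.0' SCOPE = SPFILE;
```

This instance restart to remove the restore points and change the compatible parameter is the extra step included in the downtime figure below.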

If you have checked that all your data types and features are supported, this is a list of advantages you can get from a rolling upgrade:

  • Keeps your application downtime low – in reality we had an average downtime of around 3 minutes (including an additional restart of the instance to remove the restore points and change the compatible parameter)
  • If you have problems with the upgrade, you can roll it back quite easily and revert the logical standby to a physical one
  • Your upgrade script can run longer, as your primary database is still running
  • After the upgrade you can have read-only access to your production database for tests, if needed

There are two good Oracle white papers about rolling upgrades:

The first one is longer and requires more work, but it also gives you more control over the process. The second one is more automated and easier, but you have less control over the switchover time.

This is, I hope, the first post in a rolling upgrade series – in the next one I will post more details about the manual process.

Regards,
Marcin

It’s about : Data Supply Chain

There have been a number of questions coming my way about Delphix versus snapshot technologies. The capabilities of Delphix can be differentiated from snapshot technologies through the following hierarchy:

  1. Data Supply Chain (Delphix approach to data management)
  2. Data Virtualization (end-to-end collection and provisioning of thin clones)
  3. Thin Cloning
  4. Storage Snapshots
On top we have the most powerful and advanced data management features that enable fast, easy, secure, auditable data flow through organizations.

Data Supply Chain is built on top of the other technologies. At the bottom we have the minimal building blocks, starting with storage snapshots. Storage snapshots can be used to make “thin clone” databases. They have been around for nearly two decades but have seen little use for database thin cloning due to the technical and managerial hurdles involved. Part of the difficulty is that creating thin clones requires work by multiple people and/or teams, such as DBAs, system admins, and storage admins.

To overcome the obstacles to creating thin clones, all the steps can be optimized and automated in a process called data virtualization.

Data Virtualization is just the first step in automation. The next step is adding all the processes, functionality, and control needed to manage the virtual data, which is the Data Supply Chain.
File system snapshots such as ZFS address the very bottom of the hierarchy; that is, they only manage storage snapshots. They have no automated thin cloning of databases. Without automated thin cloning of databases there is no end-to-end processing of data from source to thin cloned target, i.e. data virtualization. Without data virtualization there is no data supply chain.
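As a concrete sketch of that bottom layer, this is roughly what a raw storage snapshot and thin clone look like with the ZFS command line (the pool and filesystem names are made up for illustration):

```shell
# Snapshot the filesystem holding the source database files.
zfs snapshot dbpool/proddata@refresh1

# A clone is a writable, copy-on-write view of that snapshot:
# created in seconds, initially consuming almost no extra space.
zfs clone dbpool/proddata@refresh1 dbpool/devclone1
```

Everything the post describes above this layer, syncing the source, recovering a running database from the cloned files, masking, and self service, is what turns such snapshots into data virtualization.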
Data Supply Chain features, all of which are encompassed by Delphix, include:
  • Security
    • Masking
    • Chain of custody
  • Self Service
    • Login and Roles
    • Restrictions
  • Developer
    • Data Versioning and Branching
    • Refresh, Rollback
  • Audit
    • Live Archive
  • Modernization
    • Unix to Linux conversion
    • Data Center migration
    • Federated data cloning
    • Consolidation

Data Supply Chain re-invents data management and provisioning by virtualizing, governing, and delivering data on demand.

Most businesses manage data delivery with manual, ad hoc processes: users file change requests, then wait for DBAs, systems administrators, and storage administrators to push data from system to system, bogging down production applications, networks, and target systems with long load times. Data delays cost businesses billions a year in lost productivity and low utilization of systems and software resources.

As a result, there is an enormous opportunity to optimize data management. Optimizing data management with a data supply chain yields significant business impact:

  • Drive revenue, competitive differentiation with faster application time to market
  • Enable faster growth via better release management of enterprise applications
  • Improve customer intimacy, upsell, cross-sell with faster, more flexible analytics
  • Free budget for innovation by reducing IT maintenance costs
  • Reduce compliance risk through better governance, data security.

Businesses need to manage data as a strategic asset across their operations, applying the same rigor as supply chain optimization for manufacturing companies.

Data Supply Chain Transformation Process with Delphix

Delphix applies a three-step process to transform the data supply chain:

  • Analyze: survey systems, processes, teams across data supply chains
  • Transform: virtualize, automate data delivery with centralized governance
  • Leverage: drive business value via new data products, process optimization

Businesses typically manage multiple data supply chains simultaneously, all of which are targets for data chain optimization:

  • Compliance retention, reporting
  • Modernization, migration projects
  • Application projects and development
  • BI, analytics
  • Data protection.

Delphix re-invents the data supply chain with its Virtual Data Platform:

  • Install: data engines in hours across all repositories, locations (including cloud)
  • Connect: non-disruptively sync data across sites, systems, architectures
  • Control: secure data, track release versions, preserve and prove data history
  • Deploy: automatically launch virtual data environments in 10x less space, time
  • Leverage data with self service refresh, reset, branching, bookmarks, integration.

According to an IDC study, Delphix pays for itself in IT savings, with an average payback of 4.3 months.

 


Parallel Execution Skew - Addressing Skew Using Manual Rewrites

This is just a short note that the next part of the mini series about Parallel Execution skew has been published at AllThingsOracle.com.

After having shown in the previous instalment of the series that Oracle 12c added a new feature that can deal with Parallel Execution skew (at present in a limited number of scenarios) I now demonstrate in that part how the problem can be addressed using manual query rewrites, in particular the probably not so commonly known technique of redistributing popular values using an additional re-mapping table.