migration

Some myths about PostgreSQL vs. Oracle

By Franck Pachot

I originally wrote this as a comment on the following post, which you may find on the internet:
https://www.2ndquadrant.com/en/blog/oracle-to-postgresql-reasons-to-migrate/
but my comment was not published (it had many links in it… I suppose it was flagged as spam?), so I am publishing it here.

You should never take a decision based on what you read on the internet without verifying it. It is entirely valid to consider a move to open-source databases, but doing so without a good understanding puts the success of your migration project at risk.

DataPump migration to SECUREFILE

These tweets somewhat befuddle me.


Because, to be honest, no matter whether I say “Yes” or “No”, for someone to take that response at face value… well… whilst I’m flattered by the level of trust, I’m also a little saddened, because I think our community is best served by everyone always insisting on evidence for claims.

Crossplatform transportable tablespaces - part 2

It has been some time since I wrote the first post about TTS migration, but I finished that project literally hours before my summer break. Now, after a couple of days enjoying the thermal waters and good wine of Hungary, it's time to write the next post.

As I described in my previous post, I had to migrate a database from HP-UX to Linux and also upgrade it from 10g to 12c. This was only a PoC, but my goal was to minimize the downtime of the production database.
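For reference, the general cross-platform TTS flow looks roughly like the sketch below. This is only an outline with hypothetical tablespace, platform, and path names; since HP-UX is big-endian and Linux x86 is little-endian, the move is cross-endian and the datafiles must be converted with RMAN CONVERT.

```sql
-- On the source (HP-UX): make the tablespace self-contained and read-only.
ALTER TABLESPACE app_data READ ONLY;

-- Export the tablespace metadata with Data Pump (run from the shell):
--   expdp system/... directory=DP_DIR dumpfile=tts_app_data.dmp \
--         transport_tablespaces=APP_DATA

-- Convert the datafiles for the endian change, e.g. on the source with RMAN:
--   RMAN> CONVERT TABLESPACE app_data
--         TO PLATFORM 'Linux x86 64-bit'
--         FORMAT '/staging/%U';

-- On the target (Linux, 12c): copy the converted files and the dump file
-- across, then plug the tablespace in:
--   impdp system/... directory=DP_DIR dumpfile=tts_app_data.dmp \
--         transport_datafiles='/u01/oradata/APP/app_data01.dbf'

-- Finally, make it writable again on the target.
ALTER TABLESPACE app_data READ WRITE;
```

The supported platform names for CONVERT can be checked in V$TRANSPORTABLE_PLATFORM on either database.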

Source database datasheet:


NLS, Part Deux

A guest post today, by Brian Ledbetter, a co-worker at Agilex:

On a customer’s database, we ran across a table that would not migrate.  It was admittedly a log table, containing long chunks of HTTP header data, but whenever we tried importing it into our 11gR2 database, we ended up getting:

IMP-00058: ORACLE error 1461 encountered
ORA-01461: can bind a LONG value only for insert into a LONG column

After looking at the table structure, the first thing we noticed was that there was a VARCHAR2(4000) column in the table.  Considering that this column was already the maximum size (in bytes) for a CHAR-based data type, it became the focus of our attention.

Looking online for solutions, we found references [1] suggesting that Oracle was implicitly converting this column to a VARCHAR2(4000 CHAR) type, creating a column that can contain up to 4 bytes per character.[2]  Because this overflows the 4000 byte limit on column length, Oracle then attempted to implicitly convert the datatype to a LONG VARCHAR2, which is apparently deprecated in 11gR2.[3]  (We’re not sure why Oracle is still trying to make this conversion, if that’s the case.)
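If you hit the same error, one way to confirm this theory (a diagnostic sketch only; the table and column names below are the ones from this example) is to check the column's length semantics in the data dictionary:

```sql
-- Check the length semantics of the suspect column.
-- CHAR_USED = 'B' means BYTE semantics, 'C' means CHAR semantics;
-- DATA_LENGTH is the byte limit, CHAR_LENGTH the character limit.
SELECT column_name, data_type, data_length, char_length, char_used
FROM   user_tab_columns
WHERE  table_name  = 'TABNAME'
AND    column_name = 'COL3';
```

A VARCHAR2(4000 CHAR) column in a multibyte character set will show CHAR_USED = 'C' while DATA_LENGTH stays capped at 4000 bytes, which is exactly the mismatch that can overflow during import.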

Anyway, we tried pre-creating the table with a CLOB datatype, and that didn’t work either, so as a workaround we created a copy of the table with the data trimmed to 1000 characters (leaving plenty of room after UTF8 conversion, since 1000 characters at up to 4 bytes each still fits within the 4000-byte limit):

create table tabname_migtmp as select col1, col2, substr(col3, 1, 1000) col3 from tabname;

We then used exp/imp to copy tabname_migtmp over to the 11gR2 server, and inserted the data from it into the final location.

insert into tabname select * from tabname_migtmp;

drop table tabname_migtmp;

[1] http://forums.oracle.com/forums/thread.jspa?threadID=1038043

[2] http://stackoverflow.com/questions/5230346/char-semantics-and-ora-01461

[3] http://forums.oracle.com/forums/thread.jspa?threadID=2230351

See Also: Technote 444171.1, https://supporthtml.oracle.com/ep/faces/secure/km/DocumentDisplay.jspx?id=444171.1