Feed aggregator

Productivity with choice

Omar Tazi - Wed, 2007-02-14 16:39
This article, published in the Oracle Magazine March/April 2007 issue, explains our tooling strategy well and why Oracle is committed to both JDeveloper and Eclipse: to increase our customers' productivity no matter which development platform they end up using. Thanks to Rich Schwerin for putting it together:
http://www.oracle.com/technology/oramag/oracle/07-mar/o27opensource.html

Wikipedia and Oracle Data Mining

Marcos Campos - Mon, 2007-02-12 15:44
Wikipedia has a nice page on Oracle Data Mining (link). It provides a good overview of the features and history of the product. Here is a snippet of the text: "Oracle Data Mining (ODM) is a software product distributed as an option to Oracle Corporation's Relational Database Management System (RDBMS) Enterprise Edition (EE). This product supports a collection of data mining and data analysis …"
Categories: BI & Warehousing

New Oracle Statistical Functions Page

Marcos Campos - Mon, 2007-02-12 14:55
OTN has a new page (link) describing the statistical functions in the Oracle 10g Database. These functions are available in all versions of the database at no extra cost. Features include:
  • Descriptive statistics
  • Hypothesis testing
  • Correlation analysis (parametric and nonparametric)
  • Ranking functions
  • Cross tabulations with chi-square statistics
  • Linear regression
  • ANOVA
  • Test distribution fit
  • Window …
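As a quick, hypothetical taste of the SQL-level functions (this example assumes the standard HR sample schema and is not taken from the OTN page):

-- Correlation, least-squares regression and median, computed in a single pass
SELECT CORR(salary, commission_pct)           AS corr_sal_comm,
       REGR_SLOPE(salary, commission_pct)     AS slope,
       REGR_INTERCEPT(salary, commission_pct) AS intercept,
       MEDIAN(salary)                         AS median_salary
FROM   employees
WHERE  commission_pct IS NOT NULL;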
Categories: BI & Warehousing

Welcome BIWA

Marcos Campos - Mon, 2007-02-12 14:37
The Business Intelligence, Warehousing and Analytics Special Interest Group (BIWA SIG, BIWA for short) has recently been created. Although it has strong participation from Oracle employees, BIWA is an organization independent of Oracle. BIWA is a community in the making. It provides a number of benefits to its members (membership is free): get the latest information about Business …
Categories: BI & Warehousing

Preventing record deletion

Stephen Booth - Sun, 2007-02-11 18:33
This entry is partly an aide-memoire for me, partly an attempt to get something that has been keeping me awake for the past hour or so out of my brain so I can sleep, and partly written in the hope that someone can suggest a way forward. A quick bit of background: until April 06 most of our major systems were looked after by an external Facilities Management company. In April 06 IT was kind of outsourced to …

Getting Ready for Windows Vista

Christian Shay - Mon, 2007-02-05 05:24
The Oracle Database on Windows Vista Statement of Direction is now up on OTN and it provides time frames for the planned releases of Oracle Database on Vista.

EDIT (5/4/07): The 32-bit Oracle Database 10.2.0.3 as well as Oracle Database XE are now certified on Windows Vista. The 32-bit Oracle Client is now packaged up in a simple single install so you don't need to patch your way from 10.2.0.1, nor will you need access to Metalink. Read my recent blog entry for more details. The text in the blog entry below was an unsupported "hack" that you won't need to use anymore. If you do decide to apply the 10.2.0.3 patch instead of downloading the whole 10.2.0.3 client from OTN, please follow the installation instructions in the 10.2.0.3 patch release instead of the instructions below.

Prior to those release dates, many developers who support Windows clients (.NET, ODBC, OLEDB, OO4O, etc.) will want to get a head start and do some testing of their applications. Unfortunately, the 10.2.0.1 Oracle Installer (the one on all the CDs) will not run properly on Vista. But there is a way you can get around this. It's a bit of a convoluted hack, and none of it is supported officially, but it should help you get started testing.

The gist of it is: the installer that is included with the 10.2.0.3 patchset DOES work on Vista, and must be used first to install the 10.2.0.1 Oracle Client software and then to upgrade it to the 10.2.0.3 Oracle Client. The upgrade to 10.2.0.3 is required because there are a few other Vista-related fixes in the 10.2.0.3 patchset.

The following instructions explain how to do this:

1) Obtain 10.2.0.1 client CD or zip file (from OTN)
2) Download the 10.2.0.3 Oracle patchset for 32-bit Windows from Oracle MetaLink and unzip it. (https://metalink.oracle.com/)
3) Change directory to the 10.2.0.3 Disk1\install directory, open oraparam.ini using Notepad (or any editor) and comment out the line "SOURCE=../stage/products.xml" by adding a # to the beginning of that line.
4) Run setup.exe from the 10.2.0.3 Disk1 directory, point it to the products.xml file in the 10.2.0.1 CD location and proceed with the install of 10.2.0.1.
5) The prerequisite check will fail. This is expected. Click the boxes next to the errors; this changes them to say "User Verified." Then click "Next" to continue.
6) Once 10.2.0.1 has been installed as above, uncomment the "SOURCE=../stage/products.xml" line in the Disk1\install\oraparam.ini file in the 10.2.0.3 patchset.
7) If you wish to test "ODP.NET for .NET 2.0 version 10.2.0.2.20", you will need to download and install ODAC from OTN at this point, since only "ODP.NET for .NET 1.1" is included in the 10.2.0.3 patch.
8) From the 10.2.0.3 directory, run Disk1\setup.exe and upgrade the 10.2.0.1 install to 10.2.0.3

That's it! Er, well of course that's just the start. Now you proceed to test your application!

I can sense a veritable storm of blogging coming up in my near future, so stay tuned!

Oracle Scene - And finally..

Neil Jarvis - Wed, 2007-01-31 03:38
I've become the technical deputy editor of Oracle Scene, and we will be holding the first editorial meeting on 5th Feb 2007. If you want to contribute any articles or comments, please let me know.

In the meantime, I am going to include my first 'And Finally...' article here for your 'friendly' comments.

For my first ‘And finally….’ I would like to talk about the UKOUG Special Interest Groups, but first an introduction. Many of you have already met me, as I have been involved in the user group for over 6 years now, but for those who haven't: my name is Neil Jarvis and I've been working with the Oracle RDBMS for over 18 years, first as a programmer and then, from 1998, as a Database Administrator. In 1999 I attended my first user group conference in Birmingham, and in 2000 I became a deputy chair of the UNIX SIG under the auspices of David Kurtz. Since 2000 I have presented a few technical papers at the UNIX and DBMS SIGs, helped arrange the agendas and chaired some of the UNIX SIGs. I am also on the committee for the forthcoming Northern SIG, which will be held in April, somewhere north of Watford (watch this space for updates).

In the last 10 years or so I have been involved with, or employed by, many different organisations, ranging from financial to local authority and retail, all of which held UKOUG membership. In over half of these cases I was surprised to see they were not taking full advantage of their membership. Membership of UKOUG entitles you to access to over 120 UKOUG events, all of which are free for the first person, with a nominal charge for subsequent attendees. You also have the same access to the annual conference, currently held in Birmingham. This is a four-day event with at least 5 streams running all day. Membership also gives you access to the online resource library, which holds, amongst other things, most of the presentations given not just at the SIGs but also at the conference. The office also sends out a fortnightly e-bulletin with the latest news and reminders of forthcoming events. If this is not enough, you also receive 30% off Oracle books, plus this magazine, containing articles covering business, technical and non-technical areas.

With all these benefits, I would like to focus on the Special Interest Groups. The agendas for the meetings have to cater for the views of a large audience, and as such you may feel you couldn't justify taking the time out of the office, as some of the material may not be relevant. In my experience the opposite is true. Whilst you may not feel a presentation on ASM is relevant for your company right now, the technology will eventually catch up with you, and at that stage your company will have to pay for a course and, ironically, you'll have to take the time out of the office anyway. But if you attended your free SIGs, that knowledge would be there, in the back of your mind, ready to be accessed. So the next time a SIG comes around, don't ask "Is this SIG relevant for my company now?"; ask "Are the topics relevant for my company in the future?" Remember that your committee will be asking itself the same questions about the appropriateness of the subjects.

And finally, I would like to personally thank David Kurtz for all the time and effort he has put into the UNIX SIG over the past 6 years. Without him the UNIX SIG wouldn't be as successful as it has been. I wish him all the best in his directorship of UKOUG and hope that the present committee of the UNIX SIG continues the good work David has done.

Deployment Options with XML Publisher

Vidya Bala - Mon, 2007-01-29 16:25
This post will discuss the different ways XML Publisher can be used, along with the Microsoft Word Template Builder, to generate reports.

XML Publisher (also called BI Publisher) has the following deployment options:

1) Oracle Applications (will not be discussed in this post)
2) XML Publisher Desktop Edition
Installs the XML Publisher Template Builder into Microsoft Word, which helps you build templates for your reports. The templates can be stored as RTF files. The following are the source data options when using Template Builder in Word:
a) XML file
b) SQL query (needs connection information for the source database)
c) XML schema
d) XML generated by Siebel Analytics Answers (I have not been able to get this to work; it may be something that becomes available, and more easily integrated, in the next few releases of Siebel Analytics)
3) XML Publisher Enterprise Edition
a) Provides a web-based console that can be used to publish multiple reports
b) Enables you to define your reports and separate the data from the layout of the reports
c) Can run on any J2EE-compliant application server

XML Publisher Desktop Edition:

If you have installed the Desktop Edition, the Template Builder options will be available from the MS Word menu:

Template Builder – Data – Load XML Data (XML file), XML Schema, or Report Wizard (which lets you give database connection information and the SQL to extract the data).
Below is a quick example of how the Report Wizard can be used (connecting to the HR schema to get a list of departments).


Give the database connection information and the SQL query for the data that needs to be retrieved. We will choose the default template layout in this example.
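For example, a query along these lines would do (a hypothetical illustration assuming the standard HR sample schema):

-- List of departments for the report data set
SELECT department_id, department_name
FROM   departments
ORDER  BY department_name;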
Preview the report and then save the RTF file (in this example we save the RTF file as hr_departments.rtf).

XML Publisher Enterprise Edition:

As mentioned, XML Publisher Enterprise Edition can run on any J2EE-compliant application server.

The Admin and Reports directories are available under
Install_dir/xmlpserver

The following files have the port numbers used by the application.
HTTP Port 15101 – install_dir/default-web-site.xml
RMI Port 15111 – install_dir/rmi.xml
JMS Port 15121 – install_dir/jms.xml

The default URL to access the XML Publisher application is http://host:15101/xmlpserver (default username/password: admin/admin).

Create a new folder and a new report in that folder.
Edit the report to define the following properties:
i) Data source for the report (new data sources can be created in the Admin window)
ii) Data Model: define the SQL query
iii) New List of Values: if the report uses LOVs
iv) Parameters: if any parameters are needed for the report
v) Layouts: create a new template called hr_departments, upload hr_departments.rtf and tie it to the hr_departments template.
View the results: you can see that the template is chosen by default, along with the different output formats available. The above is a very simple illustration of how XML Publisher lets your users design their own reports (and manage changes to report design templates) while IT can focus on the data needed for the reports and other important tasks.



Categories: Development

An ADF Faces and XML Publisher Success Story

Brenden Anstey - Sun, 2007-01-28 23:43
Deciding on an output mechanism for the application
After months of developing an application using ADF Faces and ADF Business Components, it came time to solve the last of my integration and design issues: integrating a reporting/output mechanism. Early in the project I considered using Oracle Reports or doing (redoing) printable versions of the entire entry form. With over 40 sections to the online application form, neither of these options seemed desirable or suitable.

At a recent conference XML Publisher generated quite a lot of interest, so I thought I would download it and see what the hype was about. It didn't take long to work out that XML Publisher was going to be the tool of choice for my application. The original Word documents that the application was built from could be used, with the data plugged straight in where it needed to go. One of the big gains was not having to worry about formatting; this alone saved a huge amount of time.

Integrating XML Publisher with JDeveloper
Deepak Vohra has produced an article on OTN called Integrating Oracle XML Publisher with Oracle JDeveloper 10g, which provided examples of using the XML Publisher APIs in a simple Java class. To start off with, I created a small Java project based on one class, which I executed from its main method. This allowed me to get all the classpaths and libraries sorted out and gave me a good understanding of the XML Publisher Java APIs. I was able to produce and merge PDF documents, including page numbering, in a very short space of time. I had to set enough memory for the XML Publisher APIs to overcome an out-of-memory error (add "-Xmx256m" to your project run properties). The next step was to integrate this into a sandbox J2EE project and start writing to the browser response stream rather than to disk. This immediately presented a few challenges, mostly around the location of the data template and XSL files which defined the output to be produced. In fact, any interaction with the file system would prove to be a problem, because the relative execution path resolves to the location of the OC4J and not to that of the class that is running. The other key problems were getting a pooled connection to the database and coding any file system operations to run both in the embedded OC4J (XP) and on the target 10.1.3 middle tier (Linux).

Key design problems and their solutions
Getting a pooled connection for use by XML Publisher was solved by exposing a client method from the Application Module which returned a database connection from the connection pool. Getting the connection is described here in Steve Muench's blog.

The next design problem was where to actually call the XML Publisher APIs to do all of the processing. The answer, in short, was a managed bean. The bean references the binding layer to get its parameters and is called by a JSP, which sets the response stream for the XML Publisher PDFDocMerger to write its PDF output to.

As for the interaction with the file system, I used a system properties table within the database which contained the various directories for the XML data templates and XSL style sheets for the output. I coded an AM method which used System.getProperties().getProperty("os.name") to work out which OS the OC4J was running under and then fetched the property for that operating system, e.g. xmlp.windows.template.dir would give the full Windows path for the XML Publisher template directory (not quite write once, run anywhere, but close enough).
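A minimal sketch of such a properties table (the table name, column names and values here are illustrative, not the project's actual ones):

-- One row per OS-specific property, keyed by a dotted property name
CREATE TABLE system_properties (
  prop_name   VARCHAR2(100) PRIMARY KEY,
  prop_value  VARCHAR2(500)
);

INSERT INTO system_properties VALUES ('xmlp.windows.template.dir', 'C:\xmlp\templates');
INSERT INTO system_properties VALUES ('xmlp.linux.template.dir',   '/u01/app/xmlp/templates');

-- The AM method builds the key from os.name and reads the matching directory
SELECT prop_value
FROM   system_properties
WHERE  prop_name = 'xmlp.windows.template.dir';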

Summary
XML Publisher has been a huge win for this particular project and has successfully plugged a huge gap in the output mechanism for the application. The APIs are comprehensive, and every time I looked to see whether it could do something, the answer seemed to be yes, and more! XML Publisher will definitely be my preferred reporting tool for future ADF-based projects.

REPOST: Pivot and Crosstab Queries

Robert Vollman - Thu, 2007-01-25 15:00
Here is another advanced concept that will come in useful when solving Oracle problems. Imagine you're trying to create a result set where the rows need to be columns, or vice versa. In essence, you need to "pivot" rows into columns, or vice versa. That is a very common requirement, and this is where you need to look at a pivot (or crosstab) query to get the job done. As always, when you want …
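To make the idea concrete, here is a minimal sketch of the classic conditional-aggregation technique, assuming the familiar SCOTT.EMP demo table (my example, not from the original post):

-- Pivot: one row per job, with department salary totals spread across columns
SELECT job,
       SUM(CASE WHEN deptno = 10 THEN sal END) AS dept_10,
       SUM(CASE WHEN deptno = 20 THEN sal END) AS dept_20,
       SUM(CASE WHEN deptno = 30 THEN sal END) AS dept_30
FROM   emp
GROUP  BY job;

Going the other way (columns into rows) is typically done with a UNION ALL of one SELECT per source column.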

OracleXML putXML limitations

Vidya Bala - Wed, 2007-01-24 10:29
OracleXML putXML limitations:

Example 1: XML file (test.xml):

<ROWSET>
 <ROW num="2">
  <ID>15</ID>
 </ROW>
</ROWSET>

Using the Java API for the XDK, the command below
java OracleXML putXML -user "vidya/vidya" -ignorecase -filename "test.xml" "emp"

will load one row into table emp.
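OracleXML maps the child elements of each ROW to columns of the target table (the -ignorecase flag relaxes the case matching), so a minimal target table for the file above might look like this (a hypothetical sketch; the real table is not shown in the post):

-- Hypothetical target table; the column name matches the <ID> element
CREATE TABLE emp (id NUMBER);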

Example 2: XML file with namespaces (test1.xml):

<ns:EMP>
 <ns:ITEM>
  <ns:ID>2</ns:ID>
 </ns:ITEM>
</ns:EMP>

Using the Java API for the XDK, the command below
java OracleXML putXML -user "vidya/vidya" -ignorecase -filename "test1.xml" "emp"

If OracleXML cannot identify the row tag, the above command can be modified as below:

java OracleXML putXML -user "vidya/vidya" -ignorecase -rowtag "ITEM" -filename "test1.xml" "emp"

Note that the above will error, as OracleXML does not seem to work on a file with namespaces. Probably the only way to get around this is by applying a stylesheet first.
Categories: Development

What hardware should I buy?

Edward Whalen - Wed, 2007-01-24 10:28

I recently had one of my blog readers ask me for advice on some new hardware that he was going to purchase for running Oracle on Windows. It is difficult to give specific advice on what to purchase, but I can provide a few general guidelines.

  1. Get something that is expandable. If you don't need 4 CPUs now, get a system capable of supporting 4 CPUs but only purchase one or two. Make sure that you can add sufficient memory as necessary. Start with 2 or 4 GB, but make sure that there are free slots in case you need to add more.
  2. If you will be running Oracle 10g, absolutely go 64-bit. Any recent Xeon or Opteron processor supports 64-bit Windows. The 64-bit version of Windows 2003 works great and is priced similar to the 32-bit version.
  3. Get a name brand. HP, IBM, Dell, etc. Get something that is supported by the manufacturer.
  4. If possible, separate the application tier from the database tier.
  5. Get sufficient disk drives. I'm not saying that you need to start with 8, but you need enough so that IO performance is not a problem. How many do you need? I can't tell you without knowing more about the database and application, etc.
  6. Absolutely use a RAID controller and disk mirroring (RAID 1). If you lose your data, you are out of business.
  7. In larger systems I recommend separating the log drives and data drives, since the loss of one of the two is a recoverable failure. The loss of both is catastrophic.
  8. If you don't already have one, get something to back up your database with. This can be tape, DVD, etc.
  9. Get trained, get some books, etc. I'm trying to convince Oracle Press to let me do an Oracle on Windows book.

I hope that this is helpful. If you have any comments or a suggestion for a future topic, drop me an email at ewhalen@perftuning.com.

Oracle Text 9i Bug searching XML Data

Vidya Bala - Wed, 2007-01-24 10:25
In my last post I discussed a 9i (9.2.0.6) bug when using the AUTO_SECTION_GROUP section group type. This group type automatically creates a zone section for each start-tag/end-tag pair in an XML document. The section names derived from XML tags are case sensitive, as in XML.
Searches with AUTO_SECTION_GROUP work in 9.2.0.6, but not for attributes within a tag. For example, given
<Book title="A" author="B">
the attributes title and author cannot be searched using the AUTO_SECTION_GROUP section type in 9.2.0.6. The bug has been fixed in 9.2.0.8.
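A minimal sketch of the scenario (the table, index and data below are hypothetical, for illustration only):

-- Create an AUTO_SECTION_GROUP and index an XML column with it
BEGIN
  ctx_ddl.create_section_group('auto_grp', 'AUTO_SECTION_GROUP');
END;
/

CREATE INDEX books_idx ON books (doc)
  INDEXTYPE IS ctxsys.context
  PARAMETERS ('section group auto_grp');

-- With <Book title="A" author="B"> indexed, this attribute search
-- returns no rows on 9.2.0.6 but works once the 9.2.0.8 fix is applied:
SELECT id FROM books WHERE CONTAINS(doc, 'A WITHIN Book@title') > 0;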
Categories: Development

Oracle SOA Suite 10.1.3.1 review - Oracle sows the seeds for SOA

Clemens Utschig - Wed, 2007-01-24 02:47
InfoWorld published a review of our SOA Suite here today.

Here is one quote

My experience with the Oracle SOA Suite revealed a top-notch toolset well-culled from a variety of sources without much sacrifice to aptitude or usability. When it comes to message routing and services orchestration, Oracle SOA Suite meets or exceeds most needs for governance, security, insight, and optimization at a price that’s hard to beat.
James R. Borck is senior contributing editor of the InfoWorld Test Center.
-- http://www.infoworld.com/article/07/01/22/04TCoraclesoa_2.html

Enjoy reading!

OOP 2007 Keynote - Pictures

Clemens Utschig - Wed, 2007-01-24 01:37
As already announced earlier, I had a keynote at OOP this year (on "Managing successful SOA projects") here in Munich. Aside from all the really good speakers and a very geeky crowd, the snow finally caught up with even me :D

What a blast: Monday evening, and a full room of interested people - something you don't see too often at conferences where the exhibition starts on Tuesday.

Every time I speak at a conference on this topic I learn something new, from questions after a session as well as from how the crowd reacts. This time the biggest take-away from OOP: the waterfall is dead, at least there :D

Here are 2 pictures from my keynote.



Presenting at ODTUG Kaleidoscope 2007 in Orlando

Clemens Utschig - Sat, 2007-01-20 16:15
As fellow blogger Wilfred van der Deijl wrote on his blog, I got an email too - an invitation to speak there. Last year I got the slots from my manager (thx Dave :D); this year they are my own - awesome.

I'll present on 3 topics, two for beginners and one intermediate:
  • Managing successful SOA projects, a view beyond agile science
    The session will give an inside view into methodologies applied to govern successful SOA projects. We will discover some of the common challenges by examining the different phases of a project and by splitting the project into different categories - each representing its own set of issues - and how these can be successfully mastered by applying the right tools and the right skills. An insider's view, to make your SOA projects successful.

  • Oracle 11g — Oracle's Next Generation SOA Infrastructure
    which will give a sneak preview of 11g - the fully integrated, SCA-based SOA platform

  • Advanced Concepts of the BPEL Language
    which is one of the sessions from last year that was very well attended and that I also presented at Oracle Open World last year
Last year the conference was really well attended and Washington DC was great - although insanely hot and humid, which made every walk outside feel a little bit like a sauna trip.

I hope to see you there - this conference is definitely worth the trip, and hey, Daytona is not that bad :D

Heading for Europe - to support one of our key SOA Suite customers

Clemens Utschig - Sat, 2007-01-20 15:47
After a bad abdominal virus this week - one that seems to be going around the Bay Area, and that got me a sleepless night in the ER at UCSF Med - I am somewhat well again and on my way to Europe.

I got the chance to give a keynote at OOP 2007 in Munich (Germany), speaking about managing successful SOA projects and the importance of the human/organizational side in SOA projects.

If you happen to be at OOP or around Munich - I think entry to the demo pods / show floor is free - come by and visit me. I'll be there to answer questions around SOA in general, demo the SOA Suite and meet customers.

Also during the week I'll be visiting one of our customers in Germany to make sure their large, fully fledged SOA implementation hits the target.
With this in mind, and with all the feedback we receive from the field in our OTN forums, I am very happy to see the industry and the market adopting our SOA Suite - the latest edition of which is a fully integrated platform covering governance, development and security, BPEL, and ESB.

Logical Reads and Orange Trees

Eric S. Emrick - Thu, 2007-01-18 23:36
My previous post was a riddle aimed to challenge us to really think about logical I/O (session logical reads). Usually we think of I/O in terms of OS block(s), memory pages, Oracle blocks, Oracle buffer cache buffers, etc. In Oracle, a logical I/O is neither a measure of the number of buffers visited, nor the number of distinct buffers visited. We could of course craft scenarios yielding these results, but these would be contrived special cases - like an episode of Law and Order only better. Instead, logical I/O is the number of buffer visits required to satisfy your SQL statement. There is clearly a distinction between the number of buffers visited and the number of buffer visits. The distinction lies in the target of the operation being measured: the visits not the buffers. As evidenced in the previous post we can issue a full table scan and perform far more logical I/O operations than there are blocks in the table that precede the high water mark. In this case I was visiting each buffer more than one time gathering up ARRAYSIZE rows per visit.

If I had to gather up 313 oranges from an orchard using a basket that could only hold 25 oranges, then it would take me at least 13 visits to one or more trees to complete the task. Don't count the trees. Count the visits.
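A quick SQL*Plus sketch of the effect (the exact numbers will vary with block size and row width; this is an illustration, not measured output):

set autot traceonly statistics;
set arraysize 15
select * from mytable;   -- many visits per block, so many consistent gets
set arraysize 500
select * from mytable;   -- fewer, larger fetches, so far fewer consistent gets
set autot off;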

Oracle Riddles: What's Missing From This Code?

Eric S. Emrick - Mon, 2007-01-15 18:56
The SQL script below has one line intentionally omitted. The missing statement had a material impact on the performance of the targeted query. I have put diagnostic bookends around the targeted query to show that no DML or DDL has been issued to alter the result. In short, the script inserts 32K rows into a test table. I issue a query requiring a full table scan, run a single statement, and rerun the same query - also a full table scan. While the second query returns the same number of rows, it performs far fewer logical I/O operations to achieve the same result set. Review the output from the script. Can you fill in the missing statement? Fictitious bonus points will be awarded to the Oracle scholar who can deduce the precise statement :)

/* Script blog.sql


spool blog.out
set feed on echo on;
select * from v$version;
drop table mytable;
create table mytable (col1 number) tablespace users;
insert into mytable values (3);
commit;
begin
for i in 1..15 loop
insert into mytable select * from mytable;
commit;
end loop;
end;
/
analyze table mytable compute statistics;
select count(*) from mytable;
select blocks from dba_tables where table_name = 'MYTABLE';
select blocks from dba_segments where segment_name = 'MYTABLE';
select index_name from user_indexes where table_name = 'MYTABLE';
set autot traceonly;
select * from mytable;
set autot off;
REM Bookends to show no DML or DDL statement has been executed.
select statistic#, value from v$mystat where statistic# in (4,134);
... missing statement
REM Bookends to show no DML or DDL statement has been executed.
select statistic#, value from v$mystat where statistic# in (4,134);
set autot traceonly;
select * from mytable;
set autot off;
select blocks from dba_tables where table_name = 'MYTABLE';
select blocks from dba_segments where segment_name = 'MYTABLE';
select index_name from user_indexes where table_name = 'MYTABLE';
select count(*) from mytable;
spool off;


End Script blog.sql */

/* Output

oracle@eemrick:SQL> select * from v$version;
BANNER
----------------------------------------------------------------
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bi
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for Solaris: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
5 rows selected.
oracle@eemrick:SQL> drop table mytable;
Table dropped.
oracle@eemrick:SQL> create table mytable (col1 number) tablespace users;
Table created.
oracle@eemrick:SQL> insert into mytable values (3);
1 row created.
oracle@eemrick:SQL> commit;
Commit complete.
oracle@eemrick:SQL> begin
2 for i in 1..15 loop
3 insert into mytable select * from mytable;
4 commit;
5 end loop;
6 end;
7 /
PL/SQL procedure successfully completed.
oracle@eemrick:SQL> analyze table mytable compute statistics;
Table analyzed.
oracle@eemrick:SQL> select count(*) from mytable;
COUNT(*)
----------
32768
1 row selected.
oracle@eemrick:SQL> select blocks from dba_tables where table_name =
'MYTABLE';
BLOCKS
----------
61
1 row selected.
oracle@eemrick:SQL> select blocks from dba_segments where segment_name =
'MYTABLE';
BLOCKS
----------
64
1 row selected.
oracle@eemrick:SQL> select index_name from user_indexes where table_name =
'MYTABLE';
no rows selected
oracle@eemrick:SQL> set autot traceonly;
oracle@eemrick:SQL> select * from mytable;
32768 rows selected.

Execution Plan
----------------------------------------------------------
Plan hash value: 1229213413
-----------------------------------------------------------------------------
Id Operation Name Rows Bytes Cost (%CPU) Time

-----------------------------------------------------------------------------
0 SELECT STATEMENT 32768 65536 26 (4) 00:00:01

1 TABLE ACCESS FULL MYTABLE 32768 65536 26 (4) 00:00:01

-----------------------------------------------------------------------------

Statistics
----------------------------------------------------------
1 recursive calls
0 db block gets
2248 consistent gets
0 physical reads
0 redo size
668925 bytes sent via SQL*Net to client
24492 bytes received via SQL*Net from client
2186 SQL*Net roundtrips to/from client
0 sorts (memory)
0 sorts (disk)
32768 rows processed
oracle@eemrick:SQL> set autot off;
oracle@eemrick:SQL> REM Bookends to show no DML or DDL statement has been
executed.
oracle@eemrick:SQL> select statistic#, value from v$mystat where statistic#
in (4,134);
STATISTIC# VALUE
---------- ----------
4 18 <-- Statistic #4 is user commits

134 461920 <-- Statistic #134 is redo size
2 rows selected.
oracle@eemrick:SQL> ... missing echo of statement
oracle@eemrick:SQL> REM Bookends to show no DML or DDL statement has been
executed.
oracle@eemrick:SQL> select statistic#, value from v$mystat where statistic#
in (4,134);
STATISTIC# VALUE
---------- ----------
4 18
134 461920
2 rows selected.
oracle@eemrick:SQL> set autot traceonly;
oracle@eemrick:SQL> select * from mytable;
32768 rows selected.

Execution Plan
----------------------------------------------------------
Plan hash value: 1229213413
-----------------------------------------------------------------------------
Id Operation Name Rows Bytes Cost (%CPU) Time

-----------------------------------------------------------------------------
0 SELECT STATEMENT 32768 65536 26 (4) 00:00:01

1 TABLE ACCESS FULL MYTABLE 32768 65536 26 (4) 00:00:01

-----------------------------------------------------------------------------

Statistics
----------------------------------------------------------
0 recursive calls
0 db block gets
173 consistent gets
0 physical reads
0 redo size
282975 bytes sent via SQL*Net to client
1667 bytes received via SQL*Net from client
111 SQL*Net roundtrips to/from client
0 sorts (memory)
0 sorts (disk)
32768 rows processed
oracle@eemrick:SQL> set autot off;
oracle@eemrick:SQL> select blocks from dba_tables where table_name =
'MYTABLE';
BLOCKS
----------
61
1 row selected.
oracle@eemrick:SQL> select blocks from dba_segments where segment_name =
'MYTABLE';
BLOCKS
----------
64
1 row selected.
oracle@eemrick:SQL> select index_name from user_indexes where table_name =
'MYTABLE';
no rows selected
oracle@eemrick:SQL> select count(*) from mytable;
COUNT(*)
----------
32768
1 row selected.
oracle@eemrick:SQL> spool off;


End Output */

Clue: The missing statement is not "alter system set do_less_work = true;"

Second Life client running on Solaris x64 - contd

Siva Doe - Thu, 2007-01-11 20:38

As promised, here is the update.

For Mads and whoever else is interested in building the Second Life client on Solaris (x64), this is what I did.
Please do remember, these steps are just to get the client to build on Solaris. I haven't completely run the client yet (nothing beyond the login screen). So, buyer beware.
Basically, I followed the Linux instructions in the Second Life Twiki page.
Downloaded and built all the libraries mentioned, and copied the libraries and headers into a directory named 'i686-sunos5' instead of 'i686-linux'.
Modified SConstruct  (for scons building).

  • Look for 'linux' and introduce the code for 'sunos5' (replace -DLL_LINUX with -DLL_SOLARIS)
  • Remove the 'db-4.2' entry under libs line.
  • Replace 'yacc' with 'bison -y'; 'lex' with 'flex'; 'g++-3.4' with 'g++' (under /usr/sfw/bin); 'strip' with 'gstrip'

Whichever subdirectory contains 'files.linux.lst', make a copy of it called 'files.sunos5.lst'.
Then come the code changes. Basically, search for files containing 'LL_LINUX' and add "|| LL_SOLARIS" or "&& !LL_SOLARIS" as appropriate.
In llcommon/llpreprocessor.h:38, I added  "|| (defined(LL_SOLARIS) && !defined(__sparc))" to the line to set the ENDIAN correctly.
In llcommon/llsys.cpp, I added code to use the output of 'psrinfo -v' instead of reading from '/proc/cpuinfo' for SOLARIS. Similarly, I used 'getpagesize() * sysconf(_SC_PHYS_PAGES)' to get the "Physical kb". I know there are better ways, but I just wanted to get the build completed.
In llmath/llmath.h, I was running into some problems regarding 'isfinite'. I replaced it with this:
#define llfinite(val) (val <= std::numeric_limits<double>::max())
The other significant work is in 'llvfs/llvfs.cpp' and 'newview/viewer.cpp': I replaced the code for 'flock' with the appropriate 'fcntl' code.
In the files 'newview/lldrawpoolsky.h' and 'newview/llvosky.cpp', replace the variable 'sun' with 'Sun'; 'sun' is already defined on SunOS, of course.
In 'newview/viewer.cpp', I rewrote 'do_basic_glibc_backtrace()' to use 'printstacktrace()' instead.
Well, you can follow the above instructions, or send me a mail and I will send you the diff output. :-)

For runtime, you will have to set your LD_LIBRARY_PATH as mentioned in the Twiki page.
That's all I have for now. I will update later if there is anything more.

Regards
Siva

PS: It is indeed a pity that I can't log in to a server named after me ;-) (userserver.siva.lindenlab.com)
