Development

Using an "On Field Value Changes" Event in Oracle Visual Builder Cloud Service

Shay Shmeltzer - Mon, 2017-12-11 16:51

This entry is based on previous entries from John and Shray that deal with the same topic and provide the same type of solution. John's entry was created before VBCS provided the UI id for components, and Shray's entry deals with a more complex scenario that also involves fetching new data. So I figured I'd write my version here - mostly for my own future reference in case I need to do this again.

The goal is to show how you can modify the UI shown in a VBCS page in response to data changes in fields - for example, how to hide or show a field based on the value of another field.

To do this, you need to hook into the HTML lifecycle of your VBCS page and subscribe to events in the UI. Then you code the changes you want to happen. Your gateway into manipulating/extending the HTML lifecycle in VBCS is the custom component available in the VBCS component palette. It provides a way to add your own HTML+JavaScript into an existing page.

The video below shows you the process (along with a couple of small mistakes along the way):

The basic steps to follow:

Find out the ID of the business object field whose value changes you want to listen to. You'll also need to know the ID of the UI component you want to manipulate - this is shown as the last piece of information in the property inspector when you click on a component.

Once you have those, you'll add a custom component to your page and look up the observable that relates to the business object used in the page. You can pick this up from the "Generated Page Model (read-only)" section of the custom component; it will look something like EmpEntityDetailArchetype.

Next you are going to add a listener to your custom component model. Add it after the lines 

//the page view model
this.pageViewModel = params.root;

your code would look similar to this:

this._listener = this.pageViewModel.Observables.EmpEntityDetailArchetype.item.ref2Job.currentIDSingle.subscribe(function (value) {
    if (value === "2") {
        $("#pair-currency-32717").show();
    } else {
        $("#pair-currency-32717").hide();
    }
});

CustomComponentViewModel.prototype.dispose = function () {
    this._listener.dispose();
};

Where you will replace the following:

  • EmpEntityDetailArchetype should be replaced with the observable for your page model.
  • ref2Job should be replaced with the ID of the business object column whose value you are monitoring.
  • pair-currency-32717 should be replaced with the ID of the UI component you want to modify (in our case, show or hide).

You can of course do more than just show/hide a field with this approach.

Categories: Development

Reverse engineer existing Oracle tables to Quick SQL

Dimitri Gielis - Fri, 2017-12-08 13:21
If you haven't heard about Oracle Quick SQL yet, it's time to read about it: it's something you already have without knowing it (it's a packaged app in Oracle APEX), and I believe you should start using it :)

Quick SQL enables you to rapidly design and prototype data models using a markdown-like shorthand syntax that expands to standards-based Oracle SQL. You can easily create master detail relationships, check constraints, and even generate sample data.
In my blog post Create the Oracle database objects I go over the history of how I created database objects, why I think Quick SQL is great, and why I use it.

I guess most people typically use Quick SQL at the start of a new project, as it's the quickest way to create your data model and Oracle database objects. That is my primary use case too, but I started to use Quick SQL even on projects where database objects already exist.

In the project I'm currently involved in, the data model was generated by another tool, but as we iterate through the project, tables change, columns get renamed and added, row versions were requested, triggers need to be made Oracle APEX aware...

Now we could make those changes manually, but I thought it made much more sense to create the data model in Quick SQL and use the features that come with it. By clicking a checkbox we can include a row version column, Quick SQL generates the triggers automatically in an APEX-aware form, we can generate as much sample data as we want by adding /insert, and we can use all the other Quick SQL features. For example, if you want to include a history table in the future, it's just another checkbox to click.


It's also easy to check-in the Quick SQL script into source control, together with the generated DDL.
If changes need to be made, we can adapt the Quick SQL script, generate the DDL again, and see the changes immediately. It would be nice if Quick SQL could generate the ALTER statements too, but that's not the case yet. Still, it's easy enough to see what changed by comparing the scripts in source control.

If you also want to reverse engineer an existing model into Quick SQL, here's a script that gives you a head start generating the markdown style format.


I tried the script on the Quick SQL data model itself - the result you see below:


Hopefully you see the benefit of using Quick SQL in existing projects too, and the script helps you get there. Quick SQL also gets frequent updates - in the upcoming release (17.3.4), which is already online, you can add a Security Group ID to every table (to make your app multi-tenant) and you can rename the audit columns to your own naming conventions.
Categories: Development

Scaling Oracle using NVMe flash

Gerger Consulting - Tue, 2017-11-21 07:37
Attend the free webinar by storage expert Venky Nagapudi and learn how to improve the performance of your Oracle Database using new storage technologies such as NVMe flash. 

About the Webinar
Growth in users and data puts an ever-increasing strain on transactional and analytics platforms. With many options available to scale platforms, what are the considerations and what are others choosing? Vexata's VP of Product Management, Venky Nagapudi, covers how the latest storage-side technologies, such as NVMe flash, can deliver vast improvements in performance while driving down the cost and complexity of platforms. He will also cover key use cases where storage-side solutions delivered amazing results for Vexata's customers.
In this webinar, you will:
  • Hear real-world performance scaling use cases.
  • Review the pros & cons of common scaling options.
  • See specific results of choosing a storage-side solution.


About the Presenter


Venky Nagapudi has 20 years of experience in engineering and product management in the storage, networking and computer industries. Venky led product management at EMC and Applied Microsystems, held engineering leadership roles at Intel and Brocade, and holds 10 patents. He has an MBA from the Haas School of Business at UC Berkeley, an MSEE from North Carolina State University, and a BSEE from IIT Madras.

Sign up now.

Categories: Development

Date Calculations and Queries with Oracle Visual Builder Cloud Service

Shay Shmeltzer - Fri, 2017-11-17 12:10

It's very easy to define a field in a custom object in Oracle Visual Builder Cloud Service to store a date, but when it comes to doing calculations and queries based on this date you'll find that you need to resort to a little bit of JavaScript calculations.

Here are a couple of useful things to know if you are trying to do that.

Calculating Age (or time passed from a date in years)

Let's assume you are storing information about employees and one of the pieces of information you have is their date of birth - the Birthday field in the image below.

How do you show their actual age in years on a page?

You can define a calculated field in your business object - and have VBCS use the "calculate value with formula" as the source for this field.

Your formula would be something like:

(new Date() - new Date($birthdate)) / (60*60*24*1000*365)

You are calculating the difference between today's date and the birthday field and since the answer is in milliseconds you convert it to years by dividing by the number of milliseconds in a year.

Note that as you type in your formula the dialog shows you the results of the formula below the formula field - quite useful to verify that you are doing it right.

Now your page can show the age of your employees:

Filtering Based on Date

What if you wanted to limit the records shown in the table above to only show employees of a specific age?

The tricky part is that you'll need to do the calculation against the birthday field and not against the age field. The age field is not actually stored anywhere - rather it is calculated on the fly.

Let's take the table shown above and assume we want to limit it to show employees who are younger than 9 years. To do that we'll add a query condition to our table to check that the birthday is larger than the date of (today - 9 years).

The calculation of the date 9 years ago will be with a formula like this:

new Date($current_date-9*365*24*60*60*1000)

Now your table only shows the younger employees.

Want a more dynamic way to define the query criteria? You can adopt the approach I showed in the blog post about Creating Custom Search/Query Pages with Visual Builder along with the techniques shown here.

One last note: since not every year has 365 days, the millisecond conversion is not completely accurate - but it is quite close.

Categories: Development

Easy(lazy) way to check which programs have properly configured FetchSize

XTended Oracle SQL - Wed, 2017-11-15 15:37
select 
   s.module
  ,ceil(max(s.rows_processed/s.fetches)) rows_per_fetch
from v$sql s
where 
    s.rows_processed>100
and s.executions    >1
and s.fetches       >1
and s.module is not null
and s.command_type  = 3    -- SELECTs only
and s.program_id    = 0    -- do not account recursive queries from stored procs
and s.parsing_schema_id!=0 -- <> SYS
group by s.module
order by rows_per_fetch desc nulls last
/
Categories: Development

Conditional Navigation based on Queries in Oracle Visual Builder Cloud Service

Shay Shmeltzer - Wed, 2017-11-15 13:21

A couple of threads on the Oracle Visual Builder Cloud Service forum asked about writing code in buttons in VBCS that compares values entered in a page to data in business objects and performs conditional navigation based on those values. In a past blog I showed the code needed for querying VBCS objects from the UI, but another sample never hurts, so here is another demo...

For this demo I'm going to show how to do it in a login flow - assuming you have a business object that keeps usernames and passwords, and you want to develop a page where a user types a user/pass combination and you need to verify that this is indeed a valid combination that exists in the business object.

(In reality, if you want to do user authentication in VBCS you should use the built-in security frameworks and not code it this way. I'm just using this as an example.)

Here is a quick video of the working app - with pointers to the components detailed below.

The first thing you'll do is create the business object that hosts the user/pass combination - note in the video that since "user" is a reserved word, the ID for the field is actually "user_" - which is what we'll use in our code later on.

 

Next you'll want to create a new page where people can enter a user/pass combination. To do that, create a new page of type "Create" - this page will require you to associate it with a business object, so create a new business object; we won't actually keep data in it. In the video and the code, this business object is called "query".

Now design your page and add the user and pass fields - creating parallel fields in the query business object (quser and qpass in the video). You can then remove the "Save" button, which won't be used, and instead add a "validate" button.

For this new button we'll define a new custom action that will contain custom JavaScript code. Custom code should return either a success state - using resolve(); - or failure - using reject();

Based on the success or failure you can define the next action in the flow - in our case we are showing either a success or error message:

success flow

Now let's look at the custom JavaScript code:

require(['operation/js/api/Conditions', 'operation/js/api/Operator'], function (Conditions, Operator) {
    var eo = Abcs.Entities().findById('Users');
    var passid = eo.getProperty('pass');
    var userid = eo.getProperty('user_');
    var condition = Conditions.AND(
        Conditions.SIMPLE(passid, Operator.EQUALS, $QueryEntityDetailArchetypeRecord.getValue('qpass')),
        Conditions.SIMPLE(userid, Operator.EQUALS, $QueryEntityDetailArchetypeRecord.getValue('quser'))
    );
    var operation = Abcs.Operations().read({
        entity: eo,
        condition: condition });
    operation.perform().then(
        function (operationResult) {
            if (operationResult.isSuccess()) {
                operationResult.getData().forEach(function (oneRecord) {
                    resolve("ok");
                });
            }
            reject("none");
        }).catch(function (operationResult) {
            if (operationResult.isFailure()) {
                // Insert code you want to perform if fetching of records failed
                alert('didnt worked');
                reject("error");
            }
        });
});

Explaining the code:

  • Lines 2-4 - getting the pointers to the business object and the fields in it using their field id.
  • Lines 5-8 - defining a condition with AND - referencing the values of the fields on the page
  • Lines 9-11 - defining the operation to read data with the condition from the business object
  • Line 12 - executing the read operation
  • Lines 14-18 - checking if a record has been returned; if it has, we return success - there was a user/pass combination matching the condition.
  • Line 19 - otherwise we return with a failure.

One recommendation: while coding JavaScript, use a good code editor that highlights matching open/close brackets - it will save you a lot of time.

For more on the VBCS JavaScript API that you can use for accessing business components see the doc.

Categories: Development

Meet me in Australia and New Zealand at the OTN Days 2017

Dimitri Gielis - Tue, 2017-11-14 08:32
Tonight I'll start my trip from Belgium to Australia and New Zealand. Although we have a company in New Zealand and Australia, which Lino is managing, I've never been there myself. It has always been my dream to visit the other side of the earth, so I look forward to it :)

I'll present on how I build Oracle APEX apps today (and in the future) and how to make them available for others (cloud and others).

My schedule of the OTN Days 2017 (APAC Tour) looks like this:
If you are in one of those places, I would love to meet you and hear how you use Oracle APEX.
And I'm always up for showing you a live demo of APEX Office Print; you'll be among the first to see our upcoming AOP 3.2 version! Just grab me by the arm and ask :)

In Perth there will also be a Q&A slot - so any Oracle APEX question can be asked there.
Categories: Development

Karamazov

Greg Pavlik - Fri, 2017-11-10 12:02
"Brothers, have no fear of men's sin. Love a man even in his sin, for that is the semblance of Divine Love and is the highest love on earth. Love all God's creation, the whole and every grain of sand in it. Love every leaf, every ray of God's light. Love the animals, love the plants, love everything. If you love everything, you will perceive the divine mystery in things. Once you perceive it, you will begin to comprehend it better every day. And you will come at last to love the whole world with an all-embracing love. Love the animals: God has given them the rudiments of thought and joy untroubled. Do not trouble it, don't harass them, don't deprive them of their happiness, don't work against God's intent. Man, do not pride yourself on superiority to the animals; they are without sin, and you, with your greatness, defile the earth by your appearance on it, and leave the traces of your foulness after you -- alas, it is true of almost every one of us! Love children especially, for they too are sinless like the angels; they live to soften and purify our hearts and, as it were, to guide us. Woe to him who offends a child! Father Anfim taught me to love children. The kind, silent man used often on our wanderings to spend the farthings given us on sweets and cakes for the children. He could not pass by a child without emotion. That's the nature of the man.

At some thoughts one stands perplexed, especially at the sight of men's sin, and wonders whether one should use force or humble love. Always decide to use humble love. If you resolve on that once for all, you may subdue the whole world. Loving humility is marvellously strong, the strongest of all things, and there is nothing else like it....

Brothers, love is a teacher; but one must know how to acquire it, for it is hard to acquire, it is dearly bought, it is won slowly by long labour. For we must love not only occasionally, for a moment, but for ever. Everyone can love occasionally, even the wicked can.

My brother asked the birds to forgive him; that sounds senseless, but it is right; for all is like an ocean, all is flowing and blending; a touch in one place sets up movement at the other end of the earth. It may be senseless to beg forgiveness of the birds, but birds would be happier at your side -- a little happier, anyway -- and children and all animals, if you were nobler than you are now. It's all like an ocean, I tell you. Then you would pray to the birds too, consumed by an all-embracing love, in a sort of transport, and pray that they too will forgive you your sin. Treasure this ecstasy, however senseless it may seem to men."

Introduction to Oracle Developer Cloud Service Issue Tracking REST Interfaces

Shay Shmeltzer - Mon, 2017-11-06 13:38

The task tracking system in Oracle Developer Cloud Service (DevCS) helps your team manage your development priorities and process. DevCS offers a simple web interface for working with the system. However, in some cases you might want to build your own interfaces to interact with the issues. For example, you might want to build a system for end-users to report bugs in your app, and you don't want to give them direct access to the DevCS web interface. In the August 17 update of DevCS we introduced a set of REST services that let you build a custom interface that interacts with our issues repository.

The official documentation for the DevCS REST services is here.

I wanted to share some tips to help you get this going in your project. The results are in this short video demo, and the details are below.

Figuring Out The End Points

The documentation gives you the basic end-points you should be calling, but it took me a little bit of time to figure out the full URL to the end point. Turns out the URL is composed in the following way:

https://server/org-id/rest/org-id+project-id/issues/v2/issues

The first parts (server/org-id) are quite easy to get - just copy it from the URL of your project when you look at it in your browser.

The org-id+project-id part is something you can get by looking at the details of your maven repository URL - see the image below - what you are looking for is the part before the /maven/ at the end:

Note that in some projects this will also include a numeric value appended to the project name. Something like developer-oracletemplates_db-oss-devops_20266.

In the video sample below the result URL for the REST that returns the list of issues currently in the system ended up being:

https://myserver/developer-oracletemplates/rest/developer-oracletemplates_adf1221/issues/v2/issues

Creating New Issues

One of the useful services is the /issues/v2/issues/create-form service. It returns a json file that you can edit to specify information about a new task that you want to create.

Note that the file starts with {"createIssue":{"links":.... Before you use the file to insert a new issue, you'll need to remove the {"createIssue": at the start and the corresponding } at the end of the file. Only then can you use it in the POST operation that creates an issue.
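As an illustration, here is roughly how you could fetch and prepare that file with curl (using the same placeholder server and project names as the rest of this post - replace them with your own):

# Fetch the create-form JSON for a new issue (you'll be prompted for the password):
curl -u shay@oracle.com https://myserver/developer-oracletemplates/rest/developer-oracletemplates_adf1221/issues/v2/issues/create-form -o issue.json

# Edit issue.json: remove the leading {"createIssue": and the matching } at the end,
# then fill in the issue fields and POST the file back with the command shown below.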

In the video I used the following command to create the issue in the DevCS:

curl -X POST -u shay@oracle.com https://myserver/developer-oracletemplates/rest/developer-oracletemplates_adf1221/issues/v2/issues/ -d@issue.json -H 'Content-type:application/json'

(the -d allows you to specify the name of the file with the new issue, and the -H specifies the content format).

Now that you have access to the information you can create new systems on top of it using your favorite development tool. At the end of the video you can see a simple issue system I built with Oracle Visual Builder Cloud Service - more on that in a future blog entry.

 

Categories: Development

Exporting and Importing Data from Visual Builder Cloud Service - with REST Calls

Shay Shmeltzer - Thu, 2017-11-02 16:37

Visual Builder Cloud Service (VBCS) makes it very easy to create custom objects to store your data. A frequent request we get is for a way to load and export data from these business objects. As John blogged, we added a feature to support doing this through the command line - John's blog shows you the basic options for the command line.

I recently needed to do this for a customer, and thought I'd share some tips that helped me get the functionality working properly - in case others need help skipping some bumps in the road.

Here is a demo showing both import and export and how to get them to work.

Exporting Data

Export is quite simple - you use a GET operation on a REST service, the command line for calling this using curl will look like this:

curl -u user:password https://yourserver/design/ExpImp/1.0/resources/datamgr/export > exp.zip

The result is a streamed zip file, so I just added > exp.zip at the end of the command. The zip file will contain a CSV file for each object in your application.

Don't forget to replace the placeholder values with your own: the username and password, your VBCS server name, and the name of the app you are using (ExpImp in my case).

Importing Data

Having the exported CSV file makes it easy to build a CSV file for upload - in the demo I just replaced and added values in that file. Next you'll use a similar curl command to call a POST method. It will look like this:

curl -X POST -u user:password https://yourserver/design/ExpImp/1.0/resources/datamgr/import/Employee?filename=Employee.csv -H "Origin:https://yourserver" -H "Content-Type:text/csv" -T Employee.csv -v

A few things to note.

You need to specify which object you want to import into (Employee, after the /import/ in the command above), and you also need to provide a filename parameter that tells VBCS which file to import.

In the current release you need to work around a CORS security limitation - this is why we are adding a header (with the -H option) that indicates that we are sending this from the same server as the one we are running on. In an upcoming version this won't be needed.

We use the -T option to attach the csv file to our call.

Note that you should enable the "Enable basic authentication for business object REST APIs" security option for the application (Under Application Settings->Security). 

Using Import in Production Apps

In the samples above we imported and exported into an application that is still being developed - this is why we used the /design/ in our REST path.

If you want to execute these operations on an application that you have published, then replace the /design/ with /deployment/.
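For example, applying that substitution to the export call shown earlier, an export from a published version of the app would look something like this (same placeholder values as before):

curl -u user:password https://yourserver/deployment/ExpImp/1.0/resources/datamgr/export > exp.zip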

One special note about live applications: before you import data into them you'll need to lock them. You can do this from the VBCS home page, using the drop-down menu on the application.

 

Categories: Development

Using Flyway to Manage Oracle DB Versions in the Cloud

Shay Shmeltzer - Thu, 2017-10-19 13:11

This is another entry in my series about managing database scripts/schema as part of agile development. In the past I showed how to use simple SQL and Liquibase to manage schema creation/population scripts, and today I'll show you how to use Flyway.


Flyway is a free open source solution for managing "database migrations" - basically helping you keep multiple databases in sync by tracking and applying changes to the schema structure and data.

Flyway uses simple SQL scripts - which means you can use DB specific syntax - and tracks their execution in the database through a table it maintains. It is very easy to get started with and only has 6 commands that you need to be familiar with.

The main command is "migrate" which will check your database status, and then run all the newer scripts that have yet to be run on that instance.

Flyway uses a directory structure that contains a sql folder where you'll host all your SQL scripts. It uses a naming convention (which can be adjusted) where the file name starts with a version number (V1, V1.1, V2.1) followed by two underscores ("__") and a description - so a file like V1__Create_Emp_Table will show up as "Create Emp Table" when you issue the "info" command to find out the status of a database and which scripts have already run. The info command will also show you which new scripts are pending for a specific database instance.
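To make this concrete, here is a rough sketch of what the layout and commands can look like on a machine where the Flyway command-line tool is installed (the script names, connect string and password are made-up placeholders, not the values from my demo):

# sql/V1__Create_Emp_Table.sql
# sql/V1.1__Add_Salary_Column.sql
# sql/V2.1__Populate_Emp_Table.sql

# Which of these scripts have already run on this database, and which are pending?
flyway -url=jdbc:oracle:thin:@dbhost:1521/orcl -user=fw -password=secret info

# Run all pending scripts against this database:
flyway -url=jdbc:oracle:thin:@dbhost:1521/orcl -user=fw -password=secret migrate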

In the video below I show how to configure and use Flyway, and how to integrate it into an automated DevOps process leveraging Oracle Developer Cloud Service (including task tracking, Git version management of the source, and build execution of the scripts).

Flyway can integrate with various build frameworks (Ant, Maven, Gradle, etc.), but since many DB folks are not familiar with those, I chose to invoke Flyway with simple command lines in my demo. On my laptop and local MySQL DB I just used the Flyway command-line utility. However, Flyway is not installed by default on the DevCS servers, so I did a little trick:

Flyway is a Java program, so I uploaded the Flyway directory into my DevCS Git repository, along with the jars needed for Flyway and the JDBC driver. Then I looked at the script that invokes the command line, found the Java command it uses, and copied it into a regular shell command in my build:

java -cp lib/flyway-commandline-4.2.0.jar:lib/flyway-core-4.2.0.jar org.flywaydb.commandline.Main info -user=fw -password=$Password -url=jdbc:oracle:thin:@ipaddress:1521/servicename

The $Password refers to a build parameter which is encrypted.

The directory structure and files in my Git are shown in this image:

directory structure

 

Categories: Development

Your DBA Career in the Age of Oracle Cloud

Gerger Consulting - Wed, 2017-10-18 11:48
Attend the free webinar by Oracle ACE Director Craig Shallahamer and learn how the Oracle Cloud and the Oracle 18c autonomous database changes your role as an Oracle DBA.
About the Webinar
The cloud is a change that all Oracle DBAs must face. The cloud is here to stay, and that means Oracle DBAs need to adapt or get out of the game. It makes no difference if you are a new Oracle DBA or retiring in five years; before us is one of the most significant changes you will ever face.

In this webinar, you'll learn what has happened, what is happening, what you can expect, and what you can do today to ensure you are positioned to thrive in a cloud world full of surprising and exciting opportunities.

Register at this link.
Categories: Development

Introduction to Liquibase and Managing Your Database Source Code

Shay Shmeltzer - Mon, 2017-10-16 10:35

In previous posts I showed how you can manage the SQL script lifecycle with the help of Oracle Developer Cloud Service (DevCS) as part of an overall Oracle DB DevOps solution. I wanted to add one more utility that can act as an alternative or an addition to managing plain SQL scripts - Liquibase.


Liquibase is an open source solution for managing revisions of your database schema scripts. It works across various types of databases, and supports various file formats for defining the DB structure. The feature that is probably most attractive in Liquibase is its ability to roll changes back and forward from a specific point - saving you from needing to know which change/script you last ran on a specific DB instance.

Liquibase uses scripts - referred to as "changesets" - to manage the changes you do to your DB. The changesets files can be in various formats including XML, JSON, YAML, and SQL. In the examples below I'm using the XML format.

As you continue to change and enhance your DB structure through the development lifecycle you'll add more changesets. A master file lists all the changeset files (or the directories where they are). In parallel, Liquibase tracks in your database which changesets have already run.

When you issue a liquibase update command, Liquibase looks at the current state of your DB and identifies which changes have already happened. Then it runs the rest of the changes - getting you to the latest revision of the structure you are defining.
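As a rough sketch (the driver jar, changelog file name, URL and credentials below are placeholders, not the exact values from my demo), the command-line calls look something like this:

# Apply all changesets that have not yet run on this database:
liquibase --driver=oracle.jdbc.OracleDriver --classpath=ojdbc7.jar --changeLogFile=master.xml --url=jdbc:oracle:thin:@dbhost:1521/orcl --username=scott --password=secret update

# Roll back the most recent changeset, using the rollback information defined in it:
liquibase --driver=oracle.jdbc.OracleDriver --classpath=ojdbc7.jar --changeLogFile=master.xml --url=jdbc:oracle:thin:@dbhost:1521/orcl --username=scott --password=secret rollbackCount 1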

By integrating Liquibase into your overall code version management system and continuous integration platform you can sync up your database versions with your app version. In my case this would of course mean integration with Oracle Developer Cloud Service (DevCS) - which you get for free with the Oracle Database Cloud Service. In the video below I show a flow that covers:

  • Tracking my DBA tasks in the issue system
  • Modifying a local MySQL DB with Liquibase (doing forward and backward rolls)
  • Adding a change set defining a new table
  • Committing to Git
  • Automatic build implementing the changes in Oracle Database Cloud Service
  • Automatic testing with UT/PLSQL

Here is a quick 10 minute demo:

For those who want to try and replicate this, here are some resources:

A changeset that creates a "department" table with three columns:

A changeset that creates PL/SQL function, package and procedure. Note that in line 3 the dbms="oracle" means this script will only run when we are connected to an Oracle DB:

create or replace function betwnstr( a_string varchar2, a_start_pos integer, a_end_pos integer ) return varchar2
is
begin
   return substr( a_string, a_start_pos, a_end_pos - a_start_pos + 1 );
end;

create or replace package test_betwnstr as
   -- %suite(Between string function)

   -- %test(Returns substring from start position to end position)
   procedure basic_usage;
end;

create or replace package body test_betwnstr as
   procedure basic_usage is
   begin
      ut.expect( betwnstr( '1234567', 2, 5 ) ).to_equal('2345');
   end;
end;

A changeset that adds a record to a table. Line 8 has the rollback tag that defines how to do a rollback for this insert:

delete from department where id=20

 

A few tips about my DevCS project and build setup.

1. For the sake of simplicity, I loaded the liquibase and JDBC jar files into my git repository - this makes it easy for my build steps to find the files and execute them. I'm guessing you could also use Maven to host those.

2. I use a password parameter for my build so I don't need to hardcode the password, adding a bit of security to my build. Reference the parameter in your build with a $ sign - $password

3. Want to learn more about test automation with ut/PLSQL? Check out this blog entry.

 

 

Categories: Development

APEX Office Print 3.1 released - support for Docker

Dimitri Gielis - Mon, 2017-10-16 04:18
Last week we released APEX Office Print (AOP) 3.1, our best release ever :)

AOP was already the easiest and most fully integrated printing and exporting solution for Oracle APEX, but with every new release we allow you to customise the way you use AOP a bit more and add more advanced functionalities.

As more and bigger companies are using AOP, we focussed in this release more on enterprise features: for example, native HTTPS support, end-to-end and customisable debugging, a new queuing system for large volumes of prints, and overall performance enhancements and general improvements.
You can read more about this release in our release history.

One other addition I want to highlight is the ability to run AOP in a Docker configuration.
The Docker image is available for our Gold and Enterprise license.

Docker is the world’s leading software container platform. If this concept is new for you, you can read more at What is Docker?



In the last few days Martin Giffy D'Souza blogged about How to Setup Oracle DB 12.2 Docker Container and Docker Oracle and APEX, and Roel Hartman talked about Dockerize your APEX development environment. Those are some excellent posts on how to get started with Docker in an Oracle Database and APEX context.

The most important reason for us to make an APEX Office Print Docker image available was to ease the installation of multiple AOP instances even more and to make it possible to scale AOP in an enterprise way.

Here's a video showing how you can be up and running with our AOP Docker image in less than a minute:



You also find the detailed steps in the AOP documentation.

Juergen Schuster and Martin Giffy D'Souza did a podcast with me at the end of August, where I talk a bit about AOP and our development too.

If you are not yet on APEX Office Print 3.1, go and download the latest version; even if you are not an enterprise user, the upgrade is worthwhile. We updated our AOP Sample Application with some new examples too.


Happy printing and exporting from Oracle APEX with AOP :)

Categories: Development

Talking about APEX Reporting and AOP @ Montreal Oracle Dev Day 2017

Dimitri Gielis - Wed, 2017-10-11 01:00
For those in Montreal and the surrounding area I encourage you to come out to the Montreal Oracle Dev Day on October 25th (8:30-4:30 at Centre for Sustainable Development).

Here’s a summary agenda of the presentations with the full agenda here:
Aside from the presentations you will have plenty of opportunity to network and share your Oracle development experiences. All speakers will be available all day so feel free to bring your APEX questions!

You can register now online.

As I'm not often in this part of the world, it would be great to meet in person. I would love to hear your thoughts on APEX Office Print (AOP) too. If you have any questions, feedback or just want to talk about how to use AOP in your environment, don't hesitate to come up to me. I'm more than happy to talk to you :)

Categories: Development

JavaScript, Node.js, JET and APEX day, 7-NOV-2017, Belgium

Dimitri Gielis - Mon, 2017-10-09 04:07
With great pleasure I can announce we have a special APEX Meetup in Belgium on Tuesday, November 7, 2017.


Dan McGhan is our special guest and will present on different technologies that will enrich your life as an Oracle APEX Developer too.

Unlike our other APEX Meetups, we decided to do a full day event, with food and drinks :) The event is sponsored by APEX R&D, iAdvise, ODTUG and Oracle.

Agenda:
• 8.30 am - 9 am: welcome coffee and registration
• 9 am - 10 am: Setup help for those that couldn't do it before (VM with Oracle & Node.js - optional)
• 10 am -11 am: Intro to JavaScript for PL/SQL and APEX Developers (language mechanics and DOM basics)
• 11 am -12 pm: Intermediate JavaScript for PL/SQL and APEX Developers (Ajax, promises, etc.)
• 12 pm - 1 pm: Lunch
• 1 pm - 2 pm: Intro to Node.js
• 2 pm - 3 pm: RESTful APIs with Node.js
• Coffee Break
• 3.30 pm - 4.30 pm: Intro to Oracle JET
• 4.30 pm - 5.30 pm: Using JET and Node.js with APEX
• Closing Drink

We have a capacity of 75 people, so if you haven't registered yet, you can find the link here.
Categories: Development

Choosing the right template and print to PDF from your Oracle APEX application

Dimitri Gielis - Fri, 2017-10-06 07:19
This post is part of a series of posts: From idea to app or how I do an Oracle APEX project anno 2017

To motivate players to exercise more, I thought it would be a good idea to give the players a diploma (certificate) when they complete the multiplication table. They will be able to download or print this certificate if they reach a specific score. I designed it so that there is one official mtable.online diploma, but teachers or people at home can also print their own certificate in the template or colours they want.

Creating the diploma

Creating a certificate in Microsoft Office is not that hard; in fact, you can find many certificate templates online that you can just download. Most are in Word or PowerPoint format.



In our multiplication table project we have the concept of teams. One or more players can be part of one or more teams. A team could be, for example, a classroom, with the students as the players within that team. I want the team manager to be able to upload a template for the certificate (diploma) of the team.

How do you integrate those certificates in Oracle Application Express (APEX)?

With APEX Office Print, this becomes very easy!


APEX Office Print (AOP) is a print server for Oracle Application Express (APEX) which allows you to define your template in Word, Excel, PowerPoint, HTML or Markdown and merges it with your data in the Oracle database. As output you can select PDF, Word, Excel, PowerPoint, HTML or Markdown. APEX Office Print comes with an APEX plug-in and a PL/SQL API that make it very easy to select the template and the data in any Oracle APEX application. AOP is also smart and understands Oracle APEX metadata, so for example printing one or more Interactive Reports or Grids to Excel or PDF is done in a breeze. To date, APEX Office Print (AOP) is the most integrated, easiest and most flexible printing and data exporting solution for Oracle Application Express (APEX).
Full disclosure, my company APEX R&D is the maker of AOP, so I might be a bit biased, but you can check this youtube video where different APEX printing solutions are discussed.

Installation of the AOP plugin

When you go to the APEX Office Print website, you can download the cloud or on-premises version and try it out for 100 days. You just have to click Sign Up, provide an email, go to Downloads, and click the Cloud package.


 Extract the zip file that was downloaded and locate the db folder and plugin folder.



Import the APEX plug-in by going into your APEX app, to Shared Components > Plug-ins, hitting the Import button, and choosing the dynamic_action_plugin_be_apexrnd_aop_da_51.sql file. Depending on the version of APEX, you might need to choose the _50.sql file instead (in case of APEX 5.0). Follow the wizard to import the plug-in.



The plug-in calls a package, so there's one more step to make it work. Go to SQL Workshop > SQL Scripts, hit the Upload button, and select the file aop_db_pkg.sql from the db folder.



Next, click on the Run button to run the script, which will install the AOP_API3_PKG PL/SQL package.

That's it - you successfully installed the AOP plug-in and are now ready to use it in your application.

Note: in the above example we installed the Dynamic Action plug-in. AOP also comes with a process type plugin, in case you prefer a process over a dynamic action.


Calling the AOP plugin from your page

Create a new dynamic action - for example, on Click of a button - and as the Action select the APEX Office Print (AOP) plug-in:



The plug-in is very flexible and has many options, yet it's easy to use. You first tell the plug-in where your template is: in Static Application Files (Shared Components), in a table (define your SQL), at a URL, on the filesystem, ... you just select where it is and tell it which one.

Next you tell it which data you want to use: you can define SQL, PL/SQL, a URL or even the static ID of a region. AOP is smart enough to understand whether you give it the static ID of an Interactive Report or Grid, a Classic Report, or even a JET or other chart. Behind the scenes AOP reads the metadata, so it will use whatever is behind that region as the source. This is one of the features our customers like most, and something no other printing solution offers.



The plug-in has built-in help and examples, so it's easy to see how to use it.

You can define some other settings (which items to submit in session state, some special report settings etc.) and finally the output you want; PDF, Word, Excel, you name it.

Print the diploma

When you go to mtable.online, you can view the highscores for all players, or the highscores within a team.



Based on some rules you might be able to print a diploma. The logic that defines whether you are allowed to get a diploma is wrapped in a view:


Now the interesting thing is that we can define the selection of the template dynamically in the AOP plug-in. So I have some PL/SQL code that returns the correct template. If a team is selected, we use the team's template; if it doesn't have one, or if you are looking at all players, we use the default template defined by us. As described in the first paragraph, the owner of the team can define their own template; they don't need to contact us (the developers) anymore. They just create their template in Word, Excel or PowerPoint and upload it.

From the very start of developing AOP, this was a main goal: we want business users to be able to create their own template. As developers we just provide the data; how it looks is something the business people can define. With other printing solutions I had to spend hours and hours redefining the template in XSL-FO, iReport or other tools, but with AOP, as a developer, my job is done the moment I write my query :)

Back to the diploma in mtable.online: as the default template I used a PowerPoint file I downloaded from the certificate templates site. The only thing I changed was adding the substitution variables that AOP understands: {player_name}, {mi_ss}, {play_date} (those are the columns in my query - see further on).


 For my team, I went with a Word Template:


 And my plug-in looks like this:


Whenever the diploma link is clicked in the report, the AOP dynamic action kicks in and generates the PDF, based on the template from the team (Word) or the default (PowerPoint). That's it :)

APEX Office Print is used in even more places in the Multiplication Table project: when you go to the details of an exercise, which are shown in an Interactive Grid, AOP exports that Interactive Grid to our own Excel template. I'll do a dedicated post on that, as there are more interesting steps to talk about, such as adding buttons and custom dynamic actions to an Interactive Grid.

If you haven't tried APEX Office Print 3.0 yet, you can sign up for a free trial. In a few days we are also releasing AOP 3.1, our best version ever :)
Categories: Development

PL/SQL functions: Iterate and keys for associative arrays

XTended Oracle SQL - Sun, 2017-10-01 17:54

Unfortunately, associative arrays still require more “coding”:
we still can’t use “indices of” or “values of” in a simple FOR loop (though they have been available for FORALL for a long time), we don’t have convenient iterators, or even a function to get all keys…
That’s why I want to show my templates for such things as an iterator and a keys function. You can adopt these functions and create them at schema level.

declare 
   type numbers  is table of number;
   type anumbers is table of number index by pls_integer;
   a anumbers;
   i pls_integer;

   function iterate( idx in out nocopy pls_integer, arr in out nocopy anumbers) 
      return boolean
   as pragma inline;
   begin
      if idx is null 
         then idx:=arr.first; 
         else idx:=arr.next(idx);
      end if;
      return idx is not null;
   end;

  function keys(a in out nocopy anumbers) return numbers as
     res numbers:=numbers();
     idx number;
     pragma inline;
  begin
     while iterate(idx,a) loop
        res.extend;
        res(res.count):=idx;
     end loop;
     return res;
  end;

begin
   a(1):=10;
   a(3):=30;
   a(5):=50;
   a(8):=80;
   -- iterate:
   while iterate(i,a) loop
      dbms_output.put_line(a(i));
   end loop;

   -- keys:
   for i in 1..keys(a).count loop
      dbms_output.put_line(a(keys(a)(i)));
   end loop;

end;
Categories: Development

How to group connected elements (or pairs)

XTended Oracle SQL - Thu, 2017-09-28 22:03

I quite often see developers asking questions about connected components:

Table "MESSAGES" contains fields "SENDER" and "RECIPIENT", which store client ids.
How do you quickly get all groups of clients who are connected, even indirectly through other clients, if the table has X million rows?
For this table, there should be 5 groups:

  • (1, 2, 4, 8, 16)
  • (3, 6, 12)
  • (5, 10, 20)
  • (7, 14)
  • (9, 18)
SENDER  RECIPIENT
1       2
2       4
3       6
4       8
5       10
6       12
7       14
8       16
9       18
10      20

Of course, we can solve this problem using SQL only (model, recursive subquery factoring or connect by with nocycle), but such solutions will be too slow for huge tables.

Example of SQL solution

with 
   t(sender,recipient) as (select level,level*2 from dual connect by level<=10)
,  v1 as (select rownum id,t.* from t)
,  v2 as (select id, account
          from v1
           unpivot (
             account for x in (sender,recipient)
           ))
, v3 as (
           select
              id
             ,account
             ,dense_rank()over(order by account) account_n
             ,count(*)over() cnt
           from v2)
, v4 as (
           select distinct grp,account
           from v3
           model
                dimension by (id,account_n)
                measures(id grp,account,cnt)
                rules
                iterate(1e6)until(iteration_number>cnt[1,1])(
                   grp[any,any] = min(grp)[any,cv()]
                  ,grp[any,any] = min(grp)[cv(),any]
                )
)
select
   listagg(account,',')within group(order by account) s
from v4
group by grp


In such situations it's much better to implement standard algorithms like quick-find or weighted quick-union in PL/SQL.
I wrote such a solution for the first time about 5 years ago, and I even posted one of the later solutions here, but none of them were universal, so today I've created a package with a couple of functions for the most common problems: XT_CONNECTED_COMPONENTS

It contains 2 functions based on Weighted quick-find quick-union algorithm:

  • function get_strings(cur in sys_refcursor, delim varchar2:=',') return strings_array pipelined;
    It accepts a cursor and returns the connected components it finds as a table of varchar2(v_size). You can change v_size in the package definition.
    The input cursor should contain one varchar2 column with linked strings, for example: 'a,b,c'.
    You can also specify the list delimiter; by default it is a comma.
    Examples:

    select * from table(xt_connected_components.get_strings( cursor(select ELEM1||','||ELEM2 from TEST)));
    select * 
    from
     table(
       xt_connected_components.get_strings( 
         cursor(select 'a,b,c' from dual union all
                select 'd,e,f' from dual union all
                select 'e,c'   from dual union all
                select 'z'     from dual union all
                select 'X,Y'   from dual union all
                select 'Y,Z'   from dual)));
    COLUMN_VALUE
    -----------------------------------------
    STRINGS('X', 'Y', 'Z')
    STRINGS('a', 'b', 'c', 'd', 'e', 'f')
    STRINGS('z')
    
    
  • function get_numbers(cur in sys_refcursor) return numbers_array pipelined;
    This function also returns connected components, but for numbers.
    The input cursor should contain two columns with linked numbers.
    Examples:

    select * 
    from table(
            xt_connected_components.get_numbers( 
              cursor(
                select sender_id, recipient_id from messages
            )));
    select * 
    from
      table(
        xt_connected_components.get_numbers( 
           cursor(
              select level   account1
                   , level*2 account2 
              from dual 
              connect by level<=10
        )));
    SQL> select *
      2  from
      3    table(
      4      xt_connected_components.get_numbers(
      5         cursor(
      6            select level   account1
      7                 , level*2 account2
      8            from dual
      9            connect by level<=10
     10*     )))
    SQL> /
    
    COLUMN_VALUE
    ------------------------
    NUMBERS(1, 2, 4, 8, 16)
    NUMBERS(3, 6, 12)
    NUMBERS(5, 10, 20)
    NUMBERS(7, 14)
    NUMBERS(9, 18)
    

How to install:
Download all files from Github and execute “@install” in SQL*Plus or execute them in another tool in the following order:
xt_connected_components_types.sql
xt_connected_components_pkg.sql
xt_connected_components_bdy.sql

Download URL: https://github.com/xtender/xt_scripts/tree/master/extra/xt_connected_components
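If you prefer to run the installation from a shell, it boils down to something like this (a rough sketch - it assumes git and SQL*Plus are available, and the connect string is a placeholder):

# Get the scripts and run the install script, which executes the three files above in order:
git clone https://github.com/xtender/xt_scripts.git
cd xt_scripts/extra/xt_connected_components
sqlplus scott/tiger@//dbhost:1521/orcl @install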

Categories: Development

Ampersand instead of colon for bind variables

XTended Oracle SQL - Wed, 2017-09-27 10:22

I’ve troubleshooted one query today and I was very surprised that bind variables in this query were specified with &ampersand instead of :colon! I have never seen this before and I couldn’t find anything about this in documentation…
Unfortunately SQL*Plus doesn’t support ampersand yet, even if you disable define (“set define off”),
so I’ve tested such behaviour with this code:

set def off serverout on
exec declare s varchar2(1); begin execute immediate 'select 1 from dual where dummy=&var' into s using 'X'; dbms_output.put_line(s); end;

And it really works! //at least on 11.2.0.2 and 12.2.0.1

SQL> set def off serverout on
SQL> exec declare s varchar2(1); begin execute immediate 'select 1 from dual where dummy=&var' into s using 'X'; dbms_output.put_line(s); end;
1

PL/SQL procedure successfully completed.

SQL> select substr(sql_text,1,40) stext,sql_id,executions,rows_processed from v$sqlarea a where sql_text like '%dual%&var';

STEXT                                 SQL_ID        EXECUTIONS ROWS_PROCESSED
------------------------------------- ------------- ---------- --------------
select 1 from dual where dummy=&var   ckkw4u3atxz02          3              3

SQL> select * from table(dbms_xplan.display_cursor('ckkw4u3atxz02'));

PLAN_TABLE_OUTPUT
--------------------------------------------------------------------------
SQL_ID  ckkw4u3atxz02, child number 0
-------------------------------------
select 1 from dual where dummy=&var

Plan hash value: 272002086

--------------------------------------------------------------------------
| Id  | Operation         | Name | Rows  | Bytes | Cost (%CPU)| Time     |
--------------------------------------------------------------------------
|   0 | SELECT STATEMENT  |      |       |       |     2 (100)|          |
|*  1 |  TABLE ACCESS FULL| DUAL |     1 |     2 |     2   (0)| 00:00:01 |
--------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------

   1 - filter("DUMMY"=:VAR)


18 rows selected.

Update: Btw, it works for SQL only, not for PL/SQL:

SQL> var v varchar2(1);
SQL> begin &v = 'Z'; end;
  2  /
begin &v = 'Z'; end;
      *
ERROR at line 1:
ORA-06550: line 1, column 7:
PLS-00103: Encountered the symbol "&" when expecting one of the following:

SQL> exec &v := 'X';
BEGIN &v := 'X'; END;

      *
ERROR at line 1:
ORA-06550: line 1, column 7:
PLS-00103: Encountered the symbol "&" when expecting one of the following:
The symbol "&" was ignored.
SQL> exec :v := 'X';

PL/SQL procedure successfully completed.

SQL> select * from dual where dummy=&v
  2  ;

D
-
X

And we can use mixed placeholders:

SQL> select * from dual where dummy=&v and &v=:v;

D
-
X
Categories: Development
