The customer was using Defines to replace every 5th character with an "*", so that the data would look like:
Barr* Pau* Dur*nd

However, that was quite tedious for 6X30 fields.
So, Barry Durand suggested that the customer use the Clean command to change certain characters to "*". Replacing all of the vowels, for example, would make the data look like:
B*rry P**l D*r*nd

This is definitely not what the Clean command and the $clean function were designed for, but it is a good use.
Note: here are the commands:
>set cleanchar "*"
>clean "a","e","i","o","u"
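For readers who want to experiment outside Suprtool, the same vowel-masking transformation can be sketched in Python. The function name clean here is just an illustration, not part of any Suprtool API:

```python
# A sketch in Python (not Suprtool) of what "set cleanchar" plus "clean"
# do here: every listed character is replaced with the clean character.
def clean(text, chars, cleanchar="*"):
    """Replace each character in `chars` with `cleanchar`."""
    return text.translate(str.maketrans({c: cleanchar for c in chars}))

print(clean("Barry Paul Durand", "aeiou"))  # B*rry P**l D*r*nd
```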
$Clean Function and Clean Command
The $Clean function in Suprtool and Clean command in STExport now have the ability to replace a character to be cleaned with null or nothing. If you set the cleanchar as being:
Set Cleanchar "<null>"

Suprtool will remove the character specified in the clean command and effectively shift the text to the left and blank out the portion at the end.
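A rough Python illustration of the "<null>" behaviour (the helper name and sample data are made up): removing the cleaned characters shifts the remaining text left, and the field is blank-filled on the right so it keeps its original length.

```python
def clean_null(text, chars):
    """Remove every character in `chars`, shift the rest left,
    and blank-fill the end so the field length is unchanged."""
    kept = "".join(c for c in text if c not in chars)
    return kept.ljust(len(text))

print(repr(clean_null("AB,CD,", ",")))  # 'ABCD  '
```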
$Number Function
The $number function had a bug whereby it would add two zeroes and/or bad data on the end if the input number did not have a decimal point. We have fixed this bug so that the $number function no longer adds the two digits on the end in error.
However, some users worked around this issue by doing the following:
>extract target = $truncate($number(conv-field) / 100)

Since some users used this workaround, the fix to the $number function would cause their tasks to return incorrect results. Therefore, we added the Set Numbug command to have Suprtool revert from the correct behaviour and continue to have the bug.
By default, Suprtool will just convert the number and not add on the data at the end; however, if you have used the workaround, then you can add the command:
>set numbug on

either directly to the script or globally in your suprmgr file.
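The interaction between the bug, the fix and the workaround can be illustrated with a little Python arithmetic. The function names here are ours, purely for illustration:

```python
def number_fixed(s):
    """Correct conversion of an input with no decimal point: '123' -> 123."""
    return int(s)

def number_buggy(s):
    """Old behaviour: two extra digits appended when no decimal point."""
    return int(s) * 100

def workaround(s):
    """What affected scripts did: $truncate($number(x) / 100)."""
    return number_buggy(s) // 100

# The workaround applied to the buggy output matches the fixed conversion,
# which is why the workaround breaks once the bug itself is fixed.
print(workaround("123"), number_fixed("123"))  # 123 123
```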
Oracle Integers
Suprtool by default maps certain numeric fields into packed-decimal data types when they have one or more decimal places:
Precision   Decimal Places   Suprtool Data-Type
None        Any              8-byte IEEE
1-4         Zero             2-byte Integer
5-9         Zero             4-byte Integer
1-9         Non-zero         Packed-decimal
10-27       Any              Packed-decimal
28-38       Any              8-byte IEEE
The new setting, Set Oracle Integer On, changes the Suprtool format from packed-decimal to Integer based on the size of the Number:
Precision   Decimal Places   Suprtool Data-Type
1-4         Any              2-byte Integer
5-9         Any              4-byte Integer
10-27       Any              8-byte Integer
Output=input
Suprtool now supports the Output=input command, whereby Suprtool sorts a file onto itself. This is one of the last features of MPE Suprtool that was missing in HP-UX Suprtool. It allows you to sort a large file without requiring the disk space for both an input and output copy at the same time.
Oracle Dynamic Load
We have further revised the Oracle Dynamic Load to do the following:
1) Attempt to load libclntsh.sl using the dynamic path.
2) Attempt to load libclntsh.so using the dynamic path feature.
3) Manually load $ORACLE_HOME/lib32/libclntsh.sl
Suprtool now dynamically loads all of the Oracle routines that it needs at startup. This ensures that Suprtool is using calls appropriate to your version of Oracle and provides a more stable environment.
If Suprtool fails to load the Oracle library, you will see the message:
Warning: Your oracle library could not be loaded.

This means that any reference to Oracle calls will not function. If you do not have the Oracle option enabled in Suprtool, you will not see this warning.
Eloquence Dynamic Load
Previously, Suprtool would attempt to load the Eloquence libraries and report an error message if the libraries could not be loaded. This is no longer the default behaviour. Suprtool will still try to load the libraries; however, it will not report an error by default. To check whether your libraries were loaded, you can run Suprtool with the -lw option:
./suprtool -lw
Oracle Connections
Due to recent patches and changes in Oracle security, older versions of Suprtool could not connect to some Oracle databases. We have therefore re-written all of the code that connects to databases to use more up-to-date OCI functions.
By default, Suprtool will now use this new method of connecting. This version also improves the parsing of the open command, allowing for 30 characters each for the username, password and dbname.
This also allows for connection to databases on other servers.
>Open Oracle username/password@dbname

Set Oracle OpenFix On

Set Oracle OpenFix forces Suprtool to use the olog call when connecting to Oracle databases. This option should no longer be necessary, now that the default is to use the new connection method.
Set Oracle OpenOld On

Set Oracle OpenOld forces Suprtool to use the orlon call when connecting to Oracle databases. This should only be used when connecting to Oracle 7 databases.
Table Command
The Table command is now capable of reading files greater than 4Gb. The MPE version of Suprtool already does this.
Bugs Corrected in Suprtool 4.9
On HP-UX, Table was not releasing memory-mapped space properly. This may have been the cause of a number of odd problems over the last couple of years.
Suprlink's Join command would not build the output file large enough if the input file flimit was significantly smaller than that of the Link file. This was only on MPE, since there is no such thing as a file limit on HP-UX.
For a complete list of problems corrected, go to www.robelle.com/products/whatsnew.html#suprtool
We recently received the following inquiry about extracting out portions of year, month and day, from numeric dates:
Support Team,

Here is what I need to do. I have an I2 field with a date stored in YYYYMMDD format, and what I need to be able to do is redefine the field so I can get to each piece.

example:

booking-date[1],4 = Century
booking-date[5],2 = Month
booking-date[7],2 = Day

Our response:

There is a way to isolate year, month and day in one step. It just involves some simple math in order to do what you want. It is not entirely obvious how to do this, though.
Myfile is a file with a double integer date field called "a" in the format ccyymmdd, below is a method to extract each portion.
>in myfile
>def year,1,4,display
>def month,1,2,display
>def day,1,2,display
>ext year = a / 10000
>ext month = (a / 100) mod 100
>ext day = a mod 100
>list
>xeq
>IN MYFILE.NEIL.GREEN (0) >OUT $NULL (0)
YEAR   = 2005
MONTH  = 2
DAY    = 7
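The same integer arithmetic works in any language; for example, in Python:

```python
a = 20050207                 # CCYYMMDD date held in a double integer

year = a // 10000            # 2005
month = (a // 100) % 100     # 2
day = a % 100                # 7

print(year, month, day)      # 2005 2 7
```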
You can also use the $edit function to isolate portions of the date and make the date more readable.
>in myfile
>def dispdata,1,10,byte
>ext dispdata = $edit(a,"9999 99 99")
>list
>xeq
>IN MYFILE.NEIL.GREEN (0) >OUT $NULL (0)
DISPDATA = 2005 02 07
We often get asked how to prompt for input from a jobstream. Our first suggestion is to check whether they have Streamx from Vesoft, which is part of the Security/3000 product.
If the customer does not have this product, then the other option is to create a command file that uses MPE commands and I/O re-direction to make a file that can be streamed.
In the example below, the first step is to prompt the user for the selection criteria, in this case the Catalog Code, which gets stored into the variable CATCODE.
The next steps remove any files in the user's logon called CATINVT, both permanent and temporary. The I/O redirection to $NULL is used to hide any error messages that would print if the CATINVT file did not exist, which is a perfectly valid situation.
The rest of the command file echoes commands out to a file, which is then streamed.
INPUT CATCODE;PROMPT="ENTER THE CATALOG CODE:"
PURGE CATINVT > $NULL
PURGE CATINVT,TEMP > $NULL
ECHO !!JOB CATINVT,JOBS.SGAII >> CATINVT
ECHO !!PURGE CATINVTF >> CATINVT
ECHO !!SUPR >> CATINVT
ECHO BA MACITM,5,READALL >> CATINVT
ECHO GET OFFER-ITEMS >> CATINVT
ECHO DEF CATALOG,OFFER-ITEM[1],8 >> CATINVT
ECHO DEF TEXTEDPNO,OFFER-ITEM[9],8,DISPLAY >> CATINVT
ECHO DEF EDPNO,1,4,DOUBLE >> CATINVT
ECHO IF CATALOG='!CATCODE' >> CATINVT
ECHO EXT EDPNO=TEXTEDPNO >> CATINVT
ECHO EXT PAGE-NO >> CATINVT
ECHO OUT CATINV01,LINK,TEMP >> CATINVT
ECHO X >> CATINVT
ECHO IN CATINV01 >> CATINVT
ECHO EXT EDPNO,PAGE-NO >> CATINVT
ECHO SORT EDPNO >> CATINVT
ECHO OUTPUT=INPUT >> CATINVT
ECHO X >> CATINVT
ECHO TABLE TAB1,EDPNO,SORTED,CATINV01 >> CATINVT
ECHO GET ITEM-MAST >> CATINVT
ECHO ITEM EXPECTED-DATE,DATE,CCYYMMDD >> CATINVT
ECHO IF $LOOKUP(TAB1,EDP-NO) >> CATINVT
ECHO EXT EDP-NO,ITEM-NO,DESCRIPTION,CURRENT-BO,& >> CATINVT
ECHO AVAILABLE-INV,STATUS,PO-NUMBERS,& >> CATINVT
ECHO EXPECTED-DATE,NEXT-QTY >> CATINVT
ECHO SORT EDP-NO >> CATINVT
ECHO OUT CATINV02,LINK,TEMP >> CATINVT
ECHO X >> CATINVT
ECHO LINK IN CATINV02 BY EDP-NO >> CATINVT
ECHO LINK LINK CATINV01 BY EDPNO >> CATINVT
ECHO LINK OUT CATINV03,TEMP >> CATINVT
ECHO LINK EXIT >> CATINVT
ECHO IN CATINV03 >> CATINVT
ECHO EXT ITEM-NO,DESCRIPTION,AVAILABLE-INV,& >> CATINVT
ECHO CURRENT-BO,STATUS,PAGE-NO,PO-NUMBERS,& >> CATINVT
ECHO EXPECTED-DATE,NEXT-QTY >> CATINVT
ECHO SORT PAGE-NO >> CATINVT
ECHO SORT ITEM-NO >> CATINVT
ECHO OUT CATINV04,LINK,TEMP >> CATINVT
ECHO X >> CATINVT
ECHO EXPORT IN CATINV04 >> CATINVT
ECHO EXPORT DATE YYYYMMDD '/' >> CATINVT
ECHO EXPORT HEADING FIELDNAME >> CATINVT
ECHO EXPORT OUT CATINVTF >> CATINVT
ECHO EXPORT EXIT >> CATINVT
ECHO EXIT >> CATINVT
ECHO !!EOJ >> CATINVT
SAVE CATINVT
STREAM CATINVT
This week I received the following request on how to do something using Suprtool. The customer wanted to do a table lookup and a subsequent Update from the table data; however, the record or table key needed to match on two values, one a double integer and the other a single integer (well, actually a logical, but the same number of bits). The solution, once you wrap your head around it, is amazingly simple.
The Problem
I want to take the value of a field in an old database and put it into the same field for the same record in another database. The key path is ACCOUNT; however, it also needs to compare on the non-key field SUFFIX. If ACCOUNT and SUFFIX match, then the YTD-INT field I extracted from the OLDDB:LOAN-FILE needs to be put into the YTD-INT field of the NEWDB:LOAN-FILE.
I can make the link file by:
ba olddb
get loan-file
ext account
ext suffix
ext ytd-int
sort account
sort suffix
output loanfile,link
xeq

However, how do I get this field into the corresponding record in the "newdb:loan-file" dataset?

The one thing I wasn't sure of with the $LOOKUP is that the comparison needs to be on both the ACCOUNT field and the SUFFIX field. If they match between the two sets, then the YTD-INT field from the OLDDB can replace the value in that field of the NEWDB. Here is the form for the loan-file:
LOAN-FILE            Detail   Set# 13
   Entry:                  Offset
      ACCOUNT          I2       1  (!ACCOUNT-FILE(SUFFIX))
      SUFFIX           K1       5
      NOTE             X8       7
      ...
      YTD-INT          I2     121
      FILLER05        5I1    1221
      UPDATE-COUNTER   I1    1231
Below is a sample solution. The key to this is defining a single field that is equivalent to the number and location of the bytes in the two values that must match, which for clarity are the same in my samples below.
Your two key values are:
ACCOUNT   I2   equivalent to 4 bytes
SUFFIX    K1   equivalent to 2 bytes

This makes for a total of six bytes.
My example below is essentially the same:
>form lineitems
    Database: sample  TPI: Eloquence B.07.00 B.07.00.21
LINEITEMS            Detail   Set# 5
   Entry:                  Offset
      ORDERID          I2       1  (!ID)
      ORDERLINE        I1       5
      ITEMNO          X16       7  (PARTS)
      QTY              E4      23
      PRICE            E4      31
      ITEMCOUNT        X2      39
      DELIVDATE        I2      41

So the first step is to prepare the data that you want to update the new database with, which in the example given was account, suffix and then the data value to update, ytd-int. In my case the key fields are orderid and orderline, with delivdate as the update value.
:comment
:comment Extract from old database
:comment
>set fastread on
>base sample
>get lineitems
>def mykey,1,6,byte        {define two fields as one byte-type field}
>ext mykey
>ext delivdate=20051213    {just picked an arbitrary date}
>sort mykey
>out fortable,link
>xeq
IN=136, OUT=136. CPU-Sec=1. Wall-Sec=1.

You will not be able to read the integer data in the byte field; however, since it is all just bits and bytes, the correct things will happen when loading and doing the $table lookup. This is what the form command will look like, along with some of the data:
>in fortable
>form
    File: fortable  (SD Version B.00.00)  No linefeeds
   Entry:                  Offset
      MYKEY            X6       1
      DELIVDATE        I2       7
    Entry Length: 10  Blocking: 1
>num 5
>l
>xeq
>IN fortable (0) >OUT $NULL (0)
MYKEY      = ...¤..
DELIVDATE  = 20051213
>IN fortable (1) >OUT $NULL (1)
MYKEY      = ...¤..
DELIVDATE  = 20051213
>IN fortable (2) >OUT $NULL (2)
MYKEY      = ...O..
DELIVDATE  = 20051213
>IN fortable (3) >OUT $NULL (3)
MYKEY      = ...O..
DELIVDATE  = 20051213
>IN fortable (4) >OUT $NULL (4)
MYKEY      = ...U..
DELIVDATE  = 20051213
Warning: NUMRECS exceeded; some records not processed.
IN=6, OUT=5. CPU-Sec=1. Wall-Sec=1.

Now the next step involves loading the table with the data that you want to update in the new database:
>base sample
Database password [;]?
>get lineitems
>def mykey,1,6
>table mytable,mykey,file,fortable,data(delivdate)
There are 136 entries in MYTABLE
>if $lookup(mytable,mykey)
>update
>ext delivdate=$lookup(mytable,mykey,delivdate)
>xeq
IN=136, OUT=136. CPU-Sec=1. Wall-Sec=1.

So to translate my example to your environment, the technique is essentially the same, just with different file and field names:
:comment
:comment get data from olddb
:comment
base olddb
get loan-file
def mykey,1,6
ext mykey
ext ytd-int
sort mykey
out fortable,link
xeq
:comment
:comment update the new
:comment
base newdb
get loan-file
table mytable,mykey,file,fortable,data(ytd-int)
if $lookup(mytable,mykey)
update
ext ytd-int=$lookup(mytable,mykey,ytd-int)
xeq

You can add selection criteria to suit, depending on whether you need to exclude records from the old database or exclude records that would be updated.
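The idea of treating two adjacent fields as one lookup key maps naturally onto a dictionary keyed by a tuple. A Python sketch with made-up account data (all names and values here are hypothetical):

```python
# (account, suffix, ytd_int) rows extracted from the "old" database
old_rows = [(1001, 1, 250), (1001, 2, 300), (1002, 1, 75)]

# Like TABLE mytable,mykey,FILE,fortable,DATA(ytd-int):
table = {(acct, suf): ytd for acct, suf, ytd in old_rows}

# Records in the "new" database
new_rows = [
    {"account": 1001, "suffix": 2, "ytd_int": 0},
    {"account": 1003, "suffix": 1, "ytd_int": 42},  # no match: left alone
]

for rec in new_rows:
    key = (rec["account"], rec["suffix"])  # the combined six-byte mykey
    if key in table:                       # if $lookup(mytable,mykey)
        rec["ytd_int"] = table[key]        # update / ext ytd-int=$lookup(...)

print(new_rows)
```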
Subject: Suprtool in Batch
Can you please send me a sample batch file using suprtool to extract data to a file in MPE?
Thank you.
Answer:
All you need to do is something similar to:
!JOB MYJOB,USER.ACCOUNT
!
!RUN SUPRTOOL.PUB.ROBELLE
BASE MYBASE
GET MYSET
EXTRACT FIELD1,FIELD2,FIELD3
OUTPUT MYFILE,ASCII
EXIT
!
!EOJ

Basically, the format is your standard job card, then run the Suprtool program, give it the Suprtool commands, EXIT from Suprtool, and end the job with !EOJ.
Hopefully this should be enough to get you started.
Subject: Outfile using Datestamp
Thank you. That has gotten me started. I want to stream this job daily
and have an output file that is based on the date using the format
HPYYMMDD. So for example today's output file should be HP040908
How can I do this with Suprtool?
Answer:
When I first read your email, I thought you wanted to create a data field within the output file that contained today's date in some format.
Of course, for that I'd do something like
>set varsub on
>define mydate,1,8,byte
>extract mydate="!hpyyyymmdd"

But now, from reading the email more carefully, I see that what you want is to have the *name* of the output file date-stamped. Here's how I'd do that...
!JOB MYJOB,USER.ACCOUNT
!
!file myout=HP![rht(hpyyyymmdd,6)]
!run suprtool.pub.robelle
base mybase
get...
extract...
output myout
exit
!
!EOJ

Please give this a try and let us know if it works for you.
Subject: FTP Problem
That worked! However, I now have a problem transferring the file to the ftp server. The 'PUT myout' command fails because it is looking for myout instead of HP040908. Do you have any idea how I can get around this?
!COMMENT *----------------------------------------------------------*
!COMMENT * Transfer STDLIST to pop.digital-college.edu *
!COMMENT *----------------------------------------------------------*
!RUN FTP.ARPA.SYS
OPEN ftp.digital-college.edu
USER ftpuser password
PWD
CD sync
ASCII
PUT myout
QUIT
!EOJ
Thank you very much for all your help.
Subject: Solved my own problem
You may close this case now. I realised that as long as I have a datestamped outfile, I can then give it a static name for the ftp process. In any case, the process sitting on the ftp server requires a fixed filename. This is close to the best support I have received from any IT organization. Kudos to you and your colleague.
A very satisfied customer.
STExport/iX/Copyright Robelle Solutions Technology Inc. 1995-2003 (Version 4.7.02) WED, JUL 28, 2004, 7:24 AM
Type H for Help.
$date none
$input myinput
$output myoutput
$xeq
Error: Unable to open the output file
SUPRFILEINIT Error 14
File: MYOUTPUT
MPE XL File System Error: The file size exceeds two gigabytes. (FILE OPEN ERROR -468) File System message 468

The answer is actually quite simple. STExport, by default, builds variable-length files. (We did this because most import facilities can handle variable-length files, and they are smaller to ftp to other systems.) MPE, however, does not support large files that are variable length, so the solution is to add "col fixed" to your script:
$col fixed
$date none
$input myinput
$output myoutput
$xeq
If you thought that was terrific, wait until you see this spring's version 4.8.
Suprtool version 4.8, with even more improvements, was released into production status during April 2004:
Suprtool now has a $Edit function for formatting data, just like COBOL has.
Suprtool now allows up to 255 $split functions per task. This means you can now parse almost any field-separated data entry.
Suprtool now reads Eloquence databases up to five times faster. (HP-UX specific)
Eloquence routines are now dynamically loaded, using the libraries on your system. (HP-UX specific)
Suprtool now has a $Findclean function to identify records with specific characters in them.
Dbedit for HP-UX now works on Eloquence databases, and on Oracle databases in the ImaxSoft version of Suprtool. (HP-UX specific)
The List command now has new options for listing to a file on HP-UX. (HP-UX specific)
The $Subtotal function has been re-written in order to use less resources and fix some bugs.
Suprtool's Open command can now connect to a remote Oracle database. (HP-UX specific)
Suprtool for MPE Change Notice
Suprtool for HP-UX Change Notice
All Suprtool customers will receive this new version at the time they renew their support for the year.
First off, Suprtool now has an edit-mask feature, which allows output fields to be formatted using COBOL-like edit-masks. Another key feature is that the Suprtool $split function can now be used up to 255 times in a single task. This means that you can now manually parse out fields from a typical Import file, such as PRN format.
To read all the details on these and other enhancements please visit the Manual section in the Robelle Library.
Using the OpenTURBO libraries, which map TurboImage calls into appropriate Oracle calls, Suprtool is able to access Oracle databases using the standard IMAGE/Eloquence commands (Base, Get, Delete, etc.), instead of through Suprtool's special Oracle commands (Open, Select, Add). Without OpenTURBO, some Suprtool functions are not available on Oracle (Update, Delete). This means little or no change to your Suprtool scripts when migrating from MPE (TurboImage) to HP-UX (Oracle).
So far we have done limited testing of Suprtool with OpenTURBO, but we are encouraged by the results.
If you are interested in trying this version of Suprtool with OpenTURBO, please e-mail Neil Armstrong at neil@robelle.com. To contact Lee Tsai at iMaxsoft about this product, please e-mail: lee@imaxsoft.com.
What with customers migrating applications or re-engineering them for a homesteading future with fewer budget resources, there is a lot of demand for enhancements to Suprtool that allow more report-related and data-export features. For example, we recently added $Number to convert ASCII numeric values to Binary, and Clean to remove invalid characters from ASCII fields.
Now we are adding $Edit, a function to format output fields with a COBOL-like edit-mask. This means that you will be able to have floating dollar signs and all the rest. Suprtool's new edit-masks come in two styles: one mask for byte-type fields and another mask for numeric fields.
Placeholders and Format Characters
An edit-mask consists of "placeholder" characters, such as "9" for a numeric column, and "format" characters, such as "." for the decimal place. Sometimes an edit-mask character acts as both a placeholder and a format character, such as the "$" in floating dollar signs.
Byte-Type Formatting
For example, suppose you have data that is in ccyymmdd format in an X8 field. Here is how you would use a "xxxx/xx/xx" mask to format the data:
>in mydate
>form
    File: MYDATE.TEST.NEIL  (SD Version B.00.00)
   Entry:                  Offset
      A                X8       1
    Limit: 10000  EOF: 2  Entry Length: 8
>def formatdate,1,10
>ext formatdate=$edit(a,"xxxx/xx/xx")
>list
>xeq
>IN MYDATE.NEIL.GREEN (0) >OUT $NULL (0)
FORMATDATE  = 2003/09/24
>IN MYDATE.NEIL.GREEN (1) >OUT $NULL (1)
FORMATDATE  = 2003/09/24

As you see in the example above, the "x" is the placeholder character and the "/" is the format character.
For byte-type fields, the only placeholder character is the "x". The format characters are as follows:
B (space) / (slash) , (comma) . (period) + (plus) - (minus) * (asterisk) and Space.
You insert a space either by specifying a "B" or by putting an actual Space character in the edit-mask. An example of inserting a space might be the formatting of Canadian postal codes (e.g., V3R 7K1):
>in postal
>form
    File: POSTAL.NEIL.GREEN
   Entry:                  Offset
      POSTAL-CODE      X6       1
    Limit: 10000  EOF: 2  Entry Length: 6
>def post1,1,7,byte
>def post2,1,7,byte
>ext post1=$edit(postal-code,"xxx xxx")
>ext post2=$edit(postal-code,"xxxbxxx")
>list
>xeq
>IN POSTAL.NEIL.GREEN (0) >OUT $NULL (0)
POST1  = L2H 1L2
POST2  = L2H 1L2
>IN POSTAL.NEIL.GREEN (1) >OUT $NULL (1)
POST1  = L2H 1L2
POST2  = L2H 1L2

Overflow and limits
An edit mask is limited to 32 characters in total for both numeric and byte type fields. If data overflows the edit-mask, by default Suprtool will fill that field with asterisks. There is an option to have Suprtool stop when it encounters a formatting overflow:
>set editstoperror on
will force Suprtool to stop if there is data left over after applying the edit-mask. With byte-type fields, leading spaces do not cause overflow.
Therefore if your data consists of:
" L2H1L2"

and your edit mask is:

"xxxBxxx"

it is not an overflow, since there are only spaces to the left of the "L". If the data was:

" JL2H1L2"

an overflow exception would occur.
Numeric field edit-masks
Our edit-masks for numeric fields are patterned after those in COBOL. We provide four placeholder characters ("9", "Z", "*" and "$"), each with a slightly different effect:
For example:
>ext a=$edit(int-field,"$$,$$$.99-")
>ext b=$edit(int-field,"99,999.99-")
>ext c=$edit(int-field,"cr99999.99")
>ext d=$edit(int-field,"-$9999.99")
>ext e=$edit(int-field,"**,***.99+")
>ext f=$edit(int-field,"zz,zzz.99+")
>list
>xeq
>IN FILE1SD.NEIL.GREEN (0) >OUT $NULL (0)
A  = $11.11-
B  = 00,011.11-
C  = CR00011.11
D  = -$0011.11
E  = ****11.11-
F  = 11.11-
>IN FILE1SD.NEIL.GREEN (1) >OUT $NULL (1)
A  = $22.22-
B  = 00,022.22-
C  = CR00022.22
D  = -$0022.22
E  = ****22.22-
F  = 22.22-

Signs
As shown in the example above, there are also numerous format characters for numeric edits, including four ways to specify the sign.
You can specify a sign, with +, -, or the typical accounting specification of "CR" and "DB". You will note in the example above that the "cr" in the mask was up-shifted to be "CR". This is because the entire mask is up-shifted as the mask is being parsed.
You can specify more than one sign in a numeric field edit, although Suprtool will warn you that having two sign edit-mask characters does not really make sense. COBOL gives a Questionable warning when compiling an edit-mask with two sign characters. Suprtool will apply the sign in both places.
Keep in mind that most data has three states: positive, negative, and neutral (zero).
Any neutral data will not display the sign. If you specify a "+" sign in the edit-mask and the data is negative, it will of course display a "-" sign.
Decimal Places
For numeric-type edits, Suprtool attempts to adjust the data according to the number of decimal places in the edit-mask, when compared to the number of decimal places defined in the field.
For example if the data field has one decimal place, and the edit mask has two decimal places, then the data is adjusted:
Data and Edit mask:
102.3   ZZZZ.99

will result in the final data being:

102.30

Similarly, if the data has three decimal places and the edit-mask only has two, the data will be rounded appropriately, with the same rules as outlined in the $number function.
You can specify more than one decimal place in an edit-mask. However, Suprtool will print a warning and it will utilize the right-most decimal place for data alignment.
The decimal place character is defined by a set command:
>set decimalsymbol "."
If you define another character as the decimal symbol, Suprtool will use that character as the point to align the decimals. If you define a decimal symbol that is not an allowed edit-mask character with Set Decimalsymbol, Suprtool will assume that the field has zero decimal places and adjust the data accordingly.
Currency and Dollar signs
Suprtool edit-masks support both fixed and floating dollar signs. Logic for floating dollar-signs will be invoked if more than two dollar signs are defined in the edit-mask.
A floating-dollar edit mask attempts to put the dollar sign at the left most position of the significant data. For example if you have the following data and edit mask:
0001234.54 $$$$$$.$$
the data would end up as:
$1234.54
Suprtool will not, however, put the dollar sign to the right of the decimal place. If you had the same edit mask and the data was .09, the data would end up being formatted as:
$.09
Similarly, the $edit function will attempt to place the dollar sign correctly in most cases. For example, Suprtool will not format data in the form of:

$,123.50

Suprtool does attempt to fix up these cases and would format the data in the following manner:

$123.50
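A simplified Python sketch of the floating-dollar placement rules described above (digit string in, formatted string out; thousands-separator handling and overflow are omitted, and the function name is ours):

```python
def float_dollar(digits, places=2):
    """Place '$' immediately left of the first significant digit,
    never to the right of the decimal point."""
    whole = digits[:-places].lstrip("0")  # strip leading zeroes
    frac = digits[-places:]
    return "$" + whole + "." + frac

print(float_dollar("000123454"))  # $1234.54
print(float_dollar("009"))        # $.09  (the '$' stops at the point)
```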
Overflow and floating dollars
If the number of digits in the data is equal to the number of placeholder dollar signs, then the dollar sign is dropped and not added to the edited field.
12345.50 $$$$$.99
would result in:
12345.50
Set CurrencySymbol
If Set CurrencySymbol is not equal to "$", then after the formatting has been applied, whatever symbol(s) are defined within the set command are used to replace the "$" symbol in the data.
For example, if you have the Currency symbol set as "CDN".
>set currencysymbol "CDN"
Suprtool will replace the "$" after the edit-mask has been applied with CDN, provided there is room to the left of the dollar-sign.
It is recommended that if you are using multiple characters for the currency symbol, you leave enough room to the left of the dollar sign. For example, if the CurrencySymbol is defined as CDN, then you should leave two spaces to the left of a fixed dollar sign definition. If there is not enough room to put in the currency symbol, then the dollar symbol is blank.
Overflow and limits
An edit mask is limited to 32 characters in total for both numeric and byte type fields. If data overflows the edit-mask, by default Suprtool will fill that field with asterisks. There is an option to have Suprtool stop when it encounters a formatting overflow:
>set editstoperror on
will force Suprtool to stop if there is data left over to place when applying the edit-mask. With numeric-type fields, leading zeroes do not cause overflow.
Also Suprtool, STExport and Suprlink now support environment variables. And Suprtool/UX supports Eloquence 7.0 features and expanded limits.
1234.45-
-12345
-123.2134
12343

can now be accepted and converted to any other numeric data type.
Consider the following data:
Item-number   New-Price
12345         +123.45
34563         + 27.5
21312         +  1.545

Suprtool can now read and convert the data in New-Price using the $number function. Let's say we want New-Price to be a double integer; it currently occupies eight bytes starting in position six.
Here is the task you would use to convert the New-Price free-format number into a double integer.
>in mynums
>def item-number,1,5,byte
>def new-price-ascii,6,8,display
>def new-price,1,4,double
>item new-price-ascii,dec,2
>item new-price,dec,2
>ext item-number
>ext new-price=$number(new-price-ascii)
>out somefile,link
>xeq

The $number function takes the free-format number and makes it into a valid display-type number. It will determine the decimal and sign, and add leading zeroes. It will round the number to the defined number of decimal places.
In the case of the number 1.545, Suprtool will round the value to 1.55, since the given number of decimal places is two and the following digit is five or greater. If you have a whole number such as 54, with no decimal point, the value becomes 54.00.
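The conversion and rounding described above can be sketched in Python (the function name number is ours; unlike the real $number, this simplified sketch just drops embedded spaces rather than rejecting them):

```python
from decimal import Decimal, ROUND_HALF_UP

def number(text, places=2):
    """Simplified $number sketch: handle sign and spaces, round to
    `places` implied decimals, return an integer (e.g. '1.545' -> 155)."""
    s = text.replace(" ", "")
    sign = -1 if s.startswith("-") or s.endswith("-") else 1
    s = s.strip("+-")
    step = Decimal(1).scaleb(-places)
    d = Decimal(s).quantize(step, rounding=ROUND_HALF_UP)
    return sign * int(d.scaleb(places))

print(number("+123.45"), number("+ 27.5"), number("+ 1.545"), number("54"))
# 12345 2750 155 5400
```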
Suprtool will reject data that has:
More than one sign.
More than one decimal place.
Spaces in between numbers.
Signs that are in between numbers.
Characters that are not overpunch characters.
Fields that, when edited, do not fit in the defined space for the display field.

You can control the characters that define the currency, thousand and decimal symbols for other currencies and formats using the following commands:
>set decimalsymbol "."
>set thousandsymbol ","
>set currencysymbol "$"

Suprtool in the above case will strip the currency and thousand symbols, and use the decimal symbol to determine the number of decimal places. You can set these characters to any values you want; the defaults are the ones shown in the above set commands.
The decimal and thousand symbols can only be single characters. The currency symbol allows up to four characters.
Armstrong/ Neil/ Patrick
Green/ Bob/ Miller
Fritshaw/ Elizabeth/
Edwards/ Janine/
Armstrong/Arthur/Derek
The $split function can extract each token into separate fields. The syntax for the $split function is:
$split(Field,Start Character,Occurrence,End Character,Occurrence)
The following task will $split the data in the whole field into three separate fields, left justified.
>in namefile
>define lastname,1,30
>define firstname,1,20
>define middlename,1,20
>extract lastname = $split(wholename,first,"/")
>extract firstname = $trim($split(wholename,"/","/"))
>extract middlename = $trim($split(wholename,"/",2," ",2))
>out names,link
>xeq
The first extract statement tells Suprtool to extract the bytes from the field wholename, starting at the beginning (first keyword), and stopping at the "/" character.
The second extract statement tells Suprtool to extract the bytes between the first occurrence of the "/" character and the next occurrence of the "/" character; that string is then trimmed of spaces, as it is nested within the $trim function.
The third and final extract statement tells Suprtool to extract the bytes beginning with the second occurrence of the "/" character to the second occurrence of the space character.
If the target field is not long enough to hold the data, Suprtool will abort with an error. You can easily prevent this from happening on blank fields by nesting the $split statement within a $trim or $rtrim function.
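In Python the same parse can be done with str.split; the occurrence-based end-character syntax of the third extract statement is simplified away here, since each token is simply delimited by "/":

```python
# One record from the name file above
name = "Armstrong/ Neil/ Patrick"

# Split on "/" and trim each token, like nesting $split inside $trim
last, first, middle = (part.strip() for part in name.split("/"))

print(last, first, middle)  # Armstrong Neil Patrick
```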
This feature can also help you read certain fields that are comma delimited with fixed columns.
This feature was added in Suprtool version 4.6.03, which is available as a pre-release.
A sample use of the $subtotal function could be:
>def mytotal,1,14,packed
>get orders
>sort order-number
>ext order-number
>ext part-number
>ext description
>ext sales-amount
>ext mytotal = $subtotal(sales-amount,order-number)
>out sales,link
>xeq
This would result in a file containing a running subtotal in the field mytotal for a given order-number. The target data must be a packed field with 28 digits, in order to help avoid overflow issues.
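The running-subtotal logic can be sketched in Python with hypothetical order data (already sorted by order number, as the Sort command requires):

```python
# Hypothetical (order_number, sales_amount) rows, sorted by order number
rows = [(1001, 50), (1001, 25), (1002, 10), (1002, 30), (1002, 5)]

result, prev, running = [], None, 0
for order, amount in rows:
    if order != prev:          # sort break: restart the subtotal
        prev, running = order, 0
    running += amount          # running subtotal within this order
    result.append((order, amount, running))

print(result)
```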
You could then generate a simple report with the simple Suprtool commands:
>in sales
>list standard
>xeq
The basic syntax for the $subtotal function in the Extract command is:
extract targetfield = $subtotal(field,sort-field)
You must specify the sort command before referencing the sort-field in the $subtotal function.
You can subtotal up to ten fields per pass. The $subtotal function is also available in the If command; however, it is of limited use there.
In Suprtool 4.6.03, the Clean command has improved syntax to specify which characters to look for and replace. You can specify special characters Decimal 0 thru Decimal 31 via the command:
Clean special
You can also specify a range of characters by using the following syntax:
Clean "^0:^31","^240:^255"
This enhancement makes it much easier to define characters to search for and Clean. This enhancement is available in Suprtool 4.6.03, which is available for download from our web site.
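To make the range notation concrete, here is a small Python sketch of how a spec like "^0:^31" could be expanded into a set of characters to clean. The parsing helper `expand_clean_spec` is my own illustration, not Suprtool's implementation:

```python
def expand_clean_spec(*specs):
    """Expand Clean-style specs: "^N" is decimal notation for one
    character, "^A:^B" is an inclusive range, and anything else is
    taken literally. (Sketch only; names are invented.)"""
    chars = set()
    for spec in specs:
        if ":" in spec:
            lo, hi = (int(p.lstrip("^")) for p in spec.split(":"))
            chars.update(chr(c) for c in range(lo, hi + 1))
        elif spec.startswith("^"):
            chars.add(chr(int(spec[1:])))
        else:
            chars.add(spec)
    return chars

bad = expand_clean_spec("^0:^31", "^240:^255")
print(len(bad))   # 32 control characters + 16 high-bit characters = 48
```

With the set in hand, cleaning a field is just replacing any member of the set with the clean character.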
I want to concatenate 3 fields to one field, enthr,":",entmin. How do I make the 3 fields into one, with no separators, if I output as a comma delimited field?
Answer:
You can concatenate three fields by doing the following:
DEFINE myfield,1,30,byte
EXTRACT myfield = $trim(enthr) + ":" + $trim(entmin)

The resulting field will have the contents of the three pieces concatenated, with both leading and trailing spaces trimmed. For more details and examples on string functions please see the following page on our website: www.robelle.com/tips/st-strings.html
Hint: If your fields are of type Z (zoned numeric display), you need to use Define to redeclare them as type Byte before you can use string functions on them in Suprtool.
>in mysdfile
>out myfile,num,data

Many users used this to restore the data to the original retrieval order after doing some other sorts or data manipulation.
The above commands would generate an output file called myfile. However, you would lose the SD information, and you can only put the number at the beginning or the end of the data. Suprtool now has a $counter function that allows you to place a counter at any spot in the record, as well as preserve the SD information.
>in mysdfile
>def mycount,1,4,double
>ext field1
>ext field2
>ext mycount=$counter
>ext field3
>ext otherfield
>out myfile,link
>xeq

The file myfile will be self-describing (meaning that you can feed it to Suprlink and STExport with the structure information intact) and it will contain the fields field1, field2, mycount, field3 and otherfield. The field mycount is defined as a double integer, since this is the only field type that the $counter function can use. The mycount field in each record will have a unique ascending number, starting with one.
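In effect, $counter numbers the output records one by one, and the counter column can sit anywhere among the extracted fields. A small Python sketch of that idea, with invented field values:

```python
# Each output record gets a unique ascending number, placed wherever
# you extract it in the record (here, after the first two fields).
rows = [("Smith", "Widget"), ("Jones", "Gadget"), ("Brown", "Gizmo")]

out = []
for counter, (field1, field2) in enumerate(rows, start=1):
    out.append((field1, field2, counter))   # counter at any position

print(out[0], out[-1])
```

`enumerate(..., start=1)` gives the same "starts with one" behavior described above.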
We have added two methods to clean your data: you can use Suprtool to clean an individual byte-type field, or STExport to clean all of the byte-type fields in a given file that you are exporting.
Sometimes unprintable or extraneous characters get stored in files or databases where they have no business being. This may be a tab character in an address field, or perhaps an embedded carriage return or line feed.
Suprtool now supports the clean function which will replace individual characters for a given byte field.
There are three things that Suprtool needs to know in order to "clean" a field: which characters to clean, what character to change the "bad" characters to, and which field to clean.
clean "^9","^10","."
will tell Suprtool to replace the tab character (Decimal 9), the line feed (Decimal 10), and the period with whatever the clean character is set to.
The Clean command accepts both decimal notation and the character itself; however, it is probably most convenient to use decimal notation for the characters that you wish to clean. Decimal notation is indicated by the "^" character.
>set CleanChar "."
This will set the character to replace any of the qualifying "to be cleaned" characters to be a period.
ext address1=$clean(address1)
shows how to clean the field address1. You do not necessarily need to have the target field be the same as the source field.
def new-address,1,30
ext new-address=$clean(address1)
>base mydb,1,;
>get customer
>clean "^9","^10","^0","^7"
>set cleanchar " "
>update
>ext address(1) = $clean(address(1))
>ext address(2) = $clean(address(2))
>ext address(3) = $clean(address(3))
>xeq
The above task will look at the three instances of address and replace the tab, linefeed, null and bell characters with a space.
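The replacement logic itself is simple: scan each byte and substitute the clean character for any character in the "bad" set. A minimal Python model of the task above, with invented address data:

```python
# Replace tab, line feed, null and bell with a space in each address
# line, mirroring clean "^9","^10","^0","^7" with cleanchar " ".
BAD = {"\t", "\n", "\x00", "\x07"}

def clean(field, cleanchar=" "):
    """Sketch of $clean: substitute cleanchar for each bad character."""
    return "".join(cleanchar if c in BAD else c for c in field)

address = ["12 Main St\t", "Suite\x07 4", "Anytown\n"]
cleaned = [clean(line) for line in address]
print(cleaned)
```

Because each bad character is replaced one-for-one, the field length never changes (the "&lt;null&gt;" cleanchar variant described earlier is the exception, shifting text left instead).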
$in mysdfile
$clean "^9","^10","^0","^7"
$set cleanchar " "
$out myexport
$xeq
Since the Cleanchar is by default set to space, the above task could simply be:
$in mysdfile
$clean "^9","^10","^0","^7"
$out myexport
$xeq
This feature is introduced in pre-release 4.6.02 - to arrange for a pre-release test copy of Suprtool, email neil@robelle.com
>Get History                      {specify dataset with J2 field}
>Define New-field, 1, 6, display  {new display field with fewer digits}
>Item J2-field, decimal, 2        {original field has decimal places}
>Item New-field, decimal, 2       {same decimal places as original}
>Extract New-field = J2-field
>List Standard
>Xeq
You can use a similar trick to reduce the column width for a character field, but you don't need the conversion feature of Extract, only a Define to change the length. To truncate the customer name to just the first 26 columns, you would use these commands:
>Get Customerdata
>Define Shortname,Name,26,byte
>Extract Shortname
>List Standard
>Xeq
If you no longer have the original data for recreating the PRN file with STExport, you can add a header record another way:
setvar recsize finfo("yourfile","recsize")
setvar eof finfo("yourfile","eof")+1
build newfile;rec=!recsize,,f,ascii;disc=!eof
file newfile=newfile,old
echo "your header","etc." > *newfile
reset newfile
suprtool
>in yourfile
>out newfile,append
>exit
You can do it in Suprtool, but there is a trick to it. You must do a Define on the field, then also do some arithmetic.
Here's a file with one field, X4, that contains leading spaces (i.e., " 23"):
>fo data
File:    DATA.PAUL.TECHSUP (SD Version B.00.00)
   Entry:            Offset
      X4          X4      1
   Limit: 3  EOF: 3  Entry Length: 4  Blocking: 64
>in data;o*;x
  23
   5
 123
IN=3, OUT=3. CPU-Sec=1. Wall-Sec=1.

Now we Define a new Display field and perform an arithmetic operation on it. If we multiply the field by one, it retains the same value. However, numeric results on Display fields in Suprtool always have leading zeroes, so the result is the same value, but now with leading zeroes:
>in data
>def z4,1,4,display
>ext z4 = z4 * 1
>out data2
>x
Warning: Using Output DATA2,Link
IN=3, OUT=3. CPU-Sec=1. Wall-Sec=1.

Now let's look at the file and see what the data looks like:
>fo data2
File:    DATA2.PAUL.TECHSUP (SD Version B.00.00)
   Entry:            Offset
      Z4          Z4      1
   Limit: 3  EOF: 3  Entry Length: 4  Blocking: 64
>in data2;o*;x
0023
0005
0123
IN=3, OUT=3. CPU-Sec=1. Wall-Sec=1.
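In Python terms, the multiply-by-one trick amounts to re-emitting the numeric value zero-padded to the field width. A minimal sketch, assuming the same three 4-byte display values:

```python
# Display values with leading spaces, as stored in the original file.
raw = ["  23", "   5", " 123"]

# z4 = z4 * 1 in Suprtool re-emits the value with leading zeroes;
# numerically that is just zero-padding to the 4-byte field width.
padded = [str(int(v)).zfill(4) for v in raw]
print(padded)   # ['0023', '0005', '0123']
```

The value is unchanged; only its display representation gains the leading zeroes.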
define tiny,1,1,integer
Warning: Length of 1 is of limited use for the data type INTEGER
You will find that you cannot do the usual arithmetic operations on these 1-byte integers. The solution is to extract them as 2-byte integers by padding them with a leading binary zero, then do another pass through the output file to work with the resulting 2-byte integers. For example, this is what you would do if you need to add two 1-byte integers together:
input yourfile
define tiny1,1,1
define tiny2,2,1
ext ^0,tiny1, ^0,tiny2
out tmp
xeq

input tmp
define num1,1,2,integer
define num2,3,2,integer
define sum,1,2,integer
ext sum = num1 + num2
out result
xeq
We didn't make "tmp" a self-describing Link file because we needed to redefine its 4 bytes as two 2-byte integer fields in the second pass, so passing on the structure from the first pass was of limited value. It was clearer and easier to just Define the num1 and num2 fields from scratch.
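The byte-level effect of the two passes can be sketched in Python with the struct module. The leading `^0` binary zero turns each 1-byte value into a valid big-endian 2-byte integer (HP 3000 data is big-endian); the record values here are invented:

```python
import struct

# Two 1-byte integers side by side, as in the original record.
record = bytes([7, 9])

# First pass: ext ^0,tiny1, ^0,tiny2 pads each with a leading zero byte.
tmp = b"\x00" + record[0:1] + b"\x00" + record[1:2]

# Second pass: read the result as two big-endian 2-byte integers and add.
num1, num2 = struct.unpack(">hh", tmp)
print(num1 + num2)   # 16
```

The padding works because prefixing a zero byte does not change the numeric value of a non-negative big-endian integer, only its width.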
Do the debits equal the credits?
:run suprtool.pub.robelle
>base fms.gl,5,reader
>get d-transactions
>output $null
>total trans-amt
>xeq
Totals (SUN, JAN 29, 1995, 2:56 PM):
TRANS-AMT           81451+
IN=6, OUT=6. CPU-Sec=1. Wall-Sec=1.

By default the results of the Total command are displayed on the screen ($STDLIST), but they can be appended to the output file instead. To do this, you need to turn Set Squeeze Off and add a new Total command as follows:
total $file filename,append

This tip is taken from the Robelle tutorial Ensuring Data Integrity With Suprtool (PDF Format).
I need to code a script to extract data based on date. Every day my job has to run to extract data from today back to 7 days before, based on the field SYSTEM-DATE. Below is my code. Please let me know if there is a more optimised way to code this.
I have a field SYSTEM-DATE declared as X(8), which holds the date in "YY/MM/DD" format (including the "/"s):
GET LSILM002
DEF YY,SYSTEM-DATE[1],2
DEF MM,SYSTEM-DATE[4],2
DEF DD,SYSTEM-DATE[7],2
EXT DCCODE
EXT PRODUCT
EXT RECEIPTNO
EXT SYSTEM-DATE
EXT YY
EXT MM
EXT DD
OUTPUT GRNSIP03,QUERY
X
INPUT GRNSIP03
DEF DCPRODRECPT,1,18
DEF YYMMDD,47,6
ITEM YYMMDD,DATE,YYMMDD
IF ($STDDATE(YYMMDD) >= $TODAY(-7) &
AND $STDDATE(YYMMDD) <= $TODAY)
EXT DCPRODRECPT
EXT SYSTEM-DATE
SORT DCPRODRECPT
OUTPUT GRNSIP04,QUERY
X
Our Reply:
Your code requires that the entire dataset is first copied to a flat file, rearranging the date to a format that Suprtool supports. If you have a large number of records, then the additional pass will be expensive, performance-wise.
Unfortunately, I know of no way to remove the "/" slashes from the date field without a separate pass, as you have done. But you could improve this process by treating the date as a regular character field, since the logical date values collate correctly in that format. You then just need to generate the comparison values dynamically in the jobstream. This can be done by constructing the "if" command in two short pre-passes, like this:
input catalog.pub.sys
define dt,1,6
item dt,date,YYMMDD
extract dt = $today(-7)
extract dt = $today
output dtfile,temp
numrecs 1
xeq

input dtfile
define fromdateyy,1,2
define fromdatemm,3,2
define fromdatedd,5,2
define todateyy,7,2
define todatemm,9,2
define todatedd,11,2
extract "if SYSTEM-DATE >= '"
extract fromdateyy,"/",fromdatemm,"/",fromdatedd,"'"
extract " and SYSTEM-DATE <= '"
extract todateyy,"/",todatemm,"/",todatedd,"'"
output ifcmd
xeq
This creates a file called IFCMD that looks like this:
:print ifcmd
if SYSTEM-DATE >= '01/06/12' and SYSTEM-DATE <= '01/06/19'
You can reference this in your main Suprtool task with just "use ifcmd".
This means two (small) pre-passes of one record each, rather than one pre-pass over all the data records, so it should run much faster.
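The heart of the trick is rendering today and today-minus-seven as YY/MM/DD strings so the comparison can be done on the character field directly. A Python sketch of that date arithmetic (the fixed date is invented so the output is stable):

```python
from datetime import date, timedelta

def yy_mm_dd(d):
    """Render a date as YY/MM/DD, matching the SYSTEM-DATE format."""
    return d.strftime("%y/%m/%d")

today = date(2001, 6, 19)   # a fixed "today" for illustration
frm = yy_mm_dd(today - timedelta(days=7))
cmd = f"if SYSTEM-DATE >= '{frm}' and SYSTEM-DATE <= '{yy_mm_dd(today)}'"
print(cmd)
```

This produces the same one-line "if" command shown in the IFCMD file above. Note the YY/MM/DD collation only works within a single century, which is the two-digit-year caveat inherent in the original data.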
Suprtool's If command uses a technique called short-circuit evaluation. Simply put, if a record does not qualify based on the first criterion, Suprtool does not bother checking the second.
Let's say you have the following Suprtool task:
base mydb
get order-details
table mytable,order-no,file,orders
if $lookup(mytable,order-no) and order-status = "OP"
output qualords,link
xeq
Suprtool will do the lookup in the table to see if it has a qualifying order-no; if it does, then it will go on to the next criterion and check the order-status.
If the order-no does not exist in the table, then Suprtool will not bother checking the order-status; the If command will "short-circuit" and Suprtool will move on to the next record.
Given this information, you can improve the performance of this selection by considering how much work the If command must do to check each condition. For the $lookup function, Suprtool has to go to the table and do a binary search of the orders to determine if the order-no is in the table. For the order-status, Suprtool just needs to compare a single value, a much less expensive operation. Therefore we can optimize the above If command to be:
if order-status = "OP" and $lookup(mytable,order-no)

This way Suprtool only has to do the table lookup for the orders with a status of "OP".
A general guide that you can use to determine how to code your If commands is to try to put the functions that have a $ in front of them at the end of your If command, with the exception of $read ($read is a command line feature only).
Without doing any empirical timings, here is my best guess as to how expensive each function is, from most to least expensive, in CPU terms. Of course, a lot depends on how many records are in the table, or how long the string is for the $upper and $lower functions.
$LOOKUP (most overhead)
$UPPER
$LOWER
$TRIM
$LTRIM
$RTRIM
$STDDATE
$INVALID
$DAYS
$TRUNCATE
$ABS
$NULL
$TODAY
$DATE (least overhead)

There should be minimal performance gain in putting the $today and $date functions at the end of an If statement, since they perform very little work at "execute" time.
So, as a general rule, put the $ functions at the end of your If command when you can.
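Short-circuit evaluation is easy to demonstrate in miniature: put the cheap test first and the expensive lookup runs only for records that pass it. The order table and records below are invented; the `in_table` function stands in for $lookup's binary search:

```python
import bisect

order_table = sorted([101, 105, 202])   # sorted, as a Suprtool table would be
lookups = 0                             # count how often the "expensive" test runs

def in_table(order_no):
    """Binary search, standing in for $lookup."""
    global lookups
    lookups += 1
    i = bisect.bisect_left(order_table, order_no)
    return i < len(order_table) and order_table[i] == order_no

records = [(101, "OP"), (102, "CL"), (105, "CL"), (202, "OP")]

# Cheap comparison first: Python's "and" short-circuits, just like
# Suprtool's If, so in_table is never called for non-"OP" records.
qualifying = [r for r in records if r[1] == "OP" and in_table(r[0])]
print(qualifying, lookups)
```

With four records but only two "OP" statuses, the lookup runs twice instead of four times; with the operands reversed, it would run for every record.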
Neil@robelle.com, Suprtool developer.
June 7, 2001
>base sales.db,5,reader
>get d-invoices
>if price * quantity <> amount
>list standard device LP &
> title "Price * Qty not = Amount"
>xeq

Use this tip in month-end jobs that print exception reports.
Tip taken from the Robelle tutorial Ensuring Data Integrity With Suprtool (PDF Format).
Sent: Monday, April 30, 2001
To: HP3000-L List
Subject: Re: Esc and other trash in IMAGE
From: "Thisted, Kristian" <Kristian.Thisted@SKANSKA.SE>
Good day to you all!
This has probably come up before, but I still have a problem which many of you have encountered. We are about to migrate to another DB and some fields contain characters like TAB and ESC. They are of course not accepted by the receiver.
How can I clean my Image fields? Suprtool?
(Of course there are several ways to clean up after the export.)
Hilfe!
Reply from Robelle:
Suprtool would be an excellent choice:
To find and clean up TABs in Suprtool:
>base mydb,1,
>get mydataset
>define a,myfield,1,byte
>if a = ^9          {caret 9 = decimal 9 = tab}
>update
>extract a = " "
>xeq

Above I've used the caret "^" to specify character constants. Usually the corruption occurs in the first byte of the field, so I defined the subfield 'a' as a single byte. Then Suprtool can update the subfield 'a' with a space.
You could also change the selection command to "if a < ^32" to get all the 'unprintables' below spaces.
Here's an example of a daunting report:
ACCT CONTACT-NAME    ADDRESS-TYPE
206J Fred Flintstone MAIN
206J Fred Flintstone INVOICE
206J Fred Flintstone TAPE
206K Freakazoid!     MAIN
206K Freakazoid!     INVOICE
We have three ADDRESS-TYPE records for Fred Flintstone and two for Freakazoid. It would improve the report readability tremendously if we could eliminate the redundant printing of Fred Flintstone and Freakazoid.
To simplify this report, we need to complicate the Suprtool output phase. In this example, we'll need three passes to change the duplicate data into spaces. Let's first get the records that are the originals by using Dup None. We won't forget to Set Squeeze Off so that we have enough room to append records in the second pass.
>get d-address
>define my-acct,acct-no
>extract my-acct,acct-no,contact-name
>extract address-type
>output bigfile,link
>sort my-acct
>sort contact-name
>dup none key
>set squeeze off
>xeq

Next, we'll get the records that are duplicates and extract the duplicate data as spaces. Notice that we extract the account number twice in order to preserve the sort information for the last pass.
>get d-address
>define my-acct,acct-no
>extract my-acct,acct-no=" ",contact-name=" "
>extract address-type
>sort my-acct
>sort contact-name
>dup only key
>output bigfile,append
>xeq

Let's take a look at our report.
>in bigfile
>sort my-acct
>extract acct-no / address-type
>list standard
>xeq

ACCT CONTACT-NAME    ADDRESS-TYPE
206J Fred Flintstone MAIN
                     INVOICE
                     TAPE
206K Freakazoid!     MAIN
                     INVOICE

And there we have it -- a much more readable report. Of course, we could also customize the headings, but I'll leave that as an exercise for the reader.
I would like to use the OUTPUT file,PRN command to format data to be passed to another application. Is there a way to force a write or line feed within the output utilizing the EXTRACT command?
Example:
>GET CUST
>OUTPUT file,PRN
>EXT cust-id
>EXT name,address,city,state,zip
>EXT contact,phone

to somehow produce a file looking like:
"1111" "ABC Company","11 W Pine","St Paul","MN","55114" "Joe Smith","555-555-5555" "2222" "My company"," "," "," " "Jane Doe","555-555-5555"
And our Robelle Technical Support answered:
It is possible, Kurt, if you do it in two steps. First you create your file using the regular
output myfile,prn

Then you redefine that file by designating the three different lines you want, and use the List command instead of the Output command, redirecting the List to a disc file instead of a printer.
The List command has the Oneperline option that does what you want.
For example, if this is what your PRN 'myfile' contains:
....+....10...+....20...+....30...+....40...+....50...+....60...+
"1111","ABC Compan","11 W Pine","St Paul","MN","55114","Joe Smith","555-555-5555"
"2222","My company"," "," "," "," ","Jane Doe ","555-555-5555"

Note that the records are fixed-length, so you can define new fields for particular "pieces" of the records:
>in myfile
>def line1,1,6
>def line2,8,47
>def line3,56,26
>build kurt;rec=-80,,f,ascii;disc=2000
>file suprlist=kurt;dev=disc
>list oneperline noname norec noskip
>ext line1,line2,line3
>xeq
IN=2, OUT=2. CPU-Sec=1. Wall-Sec=1.

The Oneperline option tells Suprtool to put each extracted field on its own line. (Noname, Norec and Noskip suppress the field name, the record number, and the blank line between records.) Here's the result:
>in kurt;o*;x
"1111"
"ABC Compan","11 W Pine","St Paul","MN","55114"
"Joe Smith","555-555-5555"
"2222"
"My company"," "," "," "," "
"Jane Doe ","555-555-5555"
IN=6, OUT=6. CPU-Sec=1. Wall-Sec=1.
get dataset
ext id, zone, amount
total amount
total $file $list
list standard device LP
xeq

What if you also want to include a count of the number of records? You can do this easily with two passes. The first pass adds a field with a value of 1, and the second pass totals that field, effectively providing a count of the number of records.
get dataset
ext id, zone, amount
define count,1,4,int
ext count = 1
output tempfile,link
xeq

input tempfile
ext id\amount
total count
total amount
total $file $list
list standard device LP
xeq
>base store.demo,5,reader
>get d-sales                {open a dataset}
>sort product-no            {define a sort key}
>duplicate none keys count total sales-qty sales-total
>out salessum,link          {output to a link file}
>xeq

These commands produce a file with a summary by product-no (because that is what we sorted by). The file will contain a count of the records for each product-no, and totals for sales-qty and sales-total. The count field is called ST-COUNT and the totals have field names ST-TOTAL-1 and ST-TOTAL-2; they appear at the end of each record. And because it is a self-describing file, you can easily pass it on to Suprlink, STExport or Suprtool for further processing. That's all there is to it.
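The "duplicate none keys count total" summarization is a classic group-by: one output row per sort-key value, carrying a record count and field totals. A rough Python sketch with invented sales data:

```python
# One summary row per product_no: (ST-COUNT, ST-TOTAL-1, ST-TOTAL-2),
# mirroring duplicate none keys count total sales-qty sales-total.
sales = [
    ("P1", 2, 20.0), ("P1", 1, 10.0), ("P2", 5, 50.0),
]

summary = {}
for product_no, qty, total in sorted(sales):   # sorted by the key, as Sort does
    cnt, q, t = summary.get(product_no, (0, 0, 0.0))
    summary[product_no] = (cnt + 1, q + qty, t + total)

print(summary)
```

Each key's tuple plays the role of the ST-COUNT, ST-TOTAL-1 and ST-TOTAL-2 fields appended to the summary record.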
For example, FOLLOWUP-DATE might need to be a week after SHIPPED-DATE. With the new $Days function in Suprtool, you can easily generate a date that is N days before or after any date. You need only two Suprtool tasks. The input date can be in any supported format, but the output date will be in yyyymmdd format.
The first task generates the desired date, but it will be in Julianday format. For simplicity, we assume that the file only contains the date, in yyyymmdd X8 format.
>input YOURFILE
>def shipped-date,1,8
>item shipped-date,date,yyyymmdd
>def jdate,1,4,int
>item jdate,date,julianday
>ext shipped-date
>ext jdate = $days(shipped-date) + 7
>out tmpfile,link
>xeq

The second task converts the Julian-format date to yyyymmdd format.
>in tmpfile
>def followup-date,1,8,display
>item followup-date,date,yyyymmdd
>ext shipped-date
>ext followup-date = $stddate(jdate)
>out result,link
>xeq

Now you have a self-describing file with the following information:
SHIPPED  FOLLOWUP
19981231 19990107
19991230 20000106
19990228 19990307
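What the $days/$stddate pair computes can be checked with ordinary date arithmetic. Here is a Python sketch of the same shipped-date-plus-seven-days calculation (the helper name `followup` is mine):

```python
from datetime import date, timedelta

def followup(yyyymmdd, days=7):
    """Add N days to a yyyymmdd date string, returning yyyymmdd,
    just as the two $days/$stddate passes do."""
    d = date(int(yyyymmdd[:4]), int(yyyymmdd[4:6]), int(yyyymmdd[6:]))
    return (d + timedelta(days=days)).strftime("%Y%m%d")

print(followup("19981231"), followup("19990228"))   # 19990107 19990307
```

Note that adding seven days to a year-end date rolls over into the next year, which is exactly why a Julian-day intermediate (or a real date type) is needed rather than simple arithmetic on the yyyymmdd number.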
$input foo
$quote none
$ {any other options}
$output foo2
$exit
define anydate,1,4,int
get dataset
item date-A,date,SRNChronos
item date-B,date,calendar
ext anydate = $stddate(date-A)
ext anydate = $stddate(date-B)
list standard
xeq