No problem — just have it print catalogue cards to a print file. TextPipe will read the print file and extract the data for you. Change date and time formats. Read foreign-language dates. Generate the day of the week. Create four-digit years from two-digit years, with intelligent century choice, for Y2K conversions. Translate to and from Julian Day Numbers (days since 4713 B.C.).
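Intelligent century choice for two-digit years is usually implemented with a pivot window; a minimal sketch (the pivot value of 50 is an illustrative assumption, not TextPipe's documented default):

```python
def expand_year(yy, pivot=50):
    """Expand a two-digit year to four digits using a pivot window:
    years below the pivot map to 20xx, the rest to 19xx.
    The pivot of 50 is an illustrative choice, not TextPipe's
    documented default."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(3), expand_year(87))  # 2003 1987
```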
Unit conversion? The information you are importing contains data in square meters, but your database requires square feet? No problem — TextPipe has conversions built in for distance, area, volume, weight, energy, temperature, and time. If it doesn't have the unit conversion you need, you can provide your own conversion rules. Export numeric data in decimal, hexadecimal, and scientific formats, with or without leading-zero fill for fixed-length fields.
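The square-meters-to-square-feet case mentioned above is plain arithmetic; a minimal sketch of such a conversion rule:

```python
# 1 m = 3.28084 ft, so 1 square meter = 3.28084 ** 2 square feet
SQM_TO_SQFT = 3.28084 ** 2  # about 10.7639

def sqm_to_sqft(area_sqm):
    """Convert an area from square meters to square feet."""
    return area_sqm * SQM_TO_SQFT

print(round(sqm_to_sqft(100.0), 1))  # 1076.4
```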
Name parsing? You can hardly use it as a mailing list in that format. Analyze name information. Parse names into title, first name, middle names, last name, and name-suffix components. Correctly recognize multi-word surnames like "Van der Pohl" and "de la Salle". Optionally strip accents for use with software that cannot handle accented characters. The representation of single- and double-precision floating-point values is not the same in this format as in the source IBM format. The source and target compilers may make different choices regarding arithmetic expressions using floating-point variables (order of computation, precision of intermediate variables, rounding mode, etc.).
The textual, printable representation of the same floating-point value may also differ between the two platforms (use of scientific notation, number of digits before and after the decimal point, etc.). However, in our opinion, these differences of behavior are largely acceptable if you keep in mind that floating-point arithmetic is only an approximation of mathematical exactness: you will simply get a different approximation on the target machine than on the source one.
To help you deal with this issue, we performed various experiments using varied floating-point values and computations, and we found that the tradeoff between range and precision is different on the two platforms. When the same computations are performed on ranges available on both the source and target platforms, the relative error between the observed results (as printed by DISPLAY) is always less than 10^-6 when using COMP-1 variables, and smaller still when using COMP-2 variables.
This is not a definitive proof that everything works fine, but it is at least an encouraging indication. Given these results, it seems that one can always reproduce the same behavior on the target as on the source, up to insignificant approximations, possibly by replacing some COMP-1 variables by COMP-2 ones.
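The magnitude of these approximations is easy to check outside COBOL: a value stored in single precision (the analogue of COMP-1) differs from its double-precision version (the analogue of COMP-2) by a relative error below 10^-6. A sketch using Python's standard struct module, purely as an illustration of the bound quoted above:

```python
import struct

def to_single(x):
    """Round-trip a double-precision float through single precision
    (the analogue of storing it in a COMP-1 variable)."""
    return struct.unpack('>f', struct.pack('>f', x))[0]

value = 1234.5678901234        # representable exactly only in double
single = to_single(value)
rel_err = abs(single - value) / abs(value)
print(rel_err < 1e-6)  # True: consistent with the 10^-6 bound above
```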
Thus, you will save the run-time format tests. On the source platform, a variable of type POINTER occupies 4 bytes in memory (32 bits); on all the supported target platforms, which are based on 64-bit operating systems, such a variable occupies 8 bytes. This may lead to various kinds of differences of behavior, for which we take no responsibility.
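The size change can be observed directly on any 64-bit machine; a sketch using Python's ctypes to stand in for the COBOL structures (illustrative only, not part of the converter):

```python
import ctypes

# A pointer occupies ctypes.sizeof(ctypes.c_void_p) bytes: 4 on a
# 32-bit platform like the source, 8 on the 64-bit target platforms.
print(ctypes.sizeof(ctypes.c_void_p))

class Record(ctypes.Structure):
    # A structure mixing a 4-byte integer with a pointer: its total
    # size (including alignment padding) shifts with the pointer size,
    # which is what breaks layout-dependent code during migration.
    _fields_ = [("id", ctypes.c_int32), ("next", ctypes.c_void_p)]

print(ctypes.sizeof(Record))  # 8 on 32-bit systems, 16 on 64-bit ones
```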
However, we strongly discourage such machine-dependent "hacks". Structure alignments: if a POINTER variable is part of a structure containing variants (redefinitions), and if the different variant sub-structures are designed so that one particular field of one variant must be aligned with (have the same location as) some other field in some other variant, then this property must be maintained after the POINTER variable changes size: compensation fillers must be inserted, etc.
Again, this must be handled manually. Note that such intended alignments must be maintained not only across redefinitions but also across MOVEs to other structures. On both the source and target platforms, a program parameter defined in the Linkage Section and listed in the USING clause of the Procedure Division which is not actually passed by the caller (either because an explicit OMITTED item is passed instead, or because the caller passes fewer arguments than the callee expects) appears to have a NULL address in the callee.
However, when the callee fails to check the parameter address and the actual address is NULL, the source and target platforms may behave differently. On Linux, for instance, although NULL is also address 0, this is not considered a legal address, so when the parameter is accessed, the program crashes.
It is not possible to automatically handle this situation and the associated differences of behavior, because even if the converter could insert address checks, what should it do when the test fails?
Furthermore, the set of subprograms and parameters really affected by this problem is a very small minority of all subprograms and parameters, and it would be ugly to insert such address checks for all of them. This will have to be handled manually, possibly using post-translation. The representation of the NULL pointer value may vary from one platform to another, in particular between the source and target platforms, if only because, like every other pointer value, it does not have the same size on both.
In consequence, every program which assumes a specific representation for this value, for instance by "casting" it to or from some binary integer value, may have a different behavior from one platform to another.
The COBOL converter cannot handle this issue automatically; it will have to be handled manually. In any case, we strongly discourage such machine-dependent "hacks".
The input components are all the COBOL programs in the asset, after they have been parsed by the cataloger. In addition to the restrictions imposed by the cataloger (no nested programs, etc.), the following restrictions apply.
All the anomalies reported by the cataloger must be fixed. Otherwise, there is a risk that the conversion is incorrect, or even that the converter fails (crashes). See the force-translation Clause, however. The source format for all COBOL source files (main programs and copy files) must be fixed format, with the numbering area columns and the comment area columns physically removed.
This must be done before cataloging. The data migration process must have been run before COBOL conversion is started, because the latter depends on the former, for instance to decide which files will be migrated into relational database tables; see the Process Guide for more details.
This dependency is reflected in the fact that the file migration tools generate some of the configuration files read by the COBOL converter. The system description file describes the location, type, and possible dependencies of all the source files in the asset to process. As such, it is the key by which the cataloger, and indeed all of the Rehosting Workbench tools, including the COBOL Converter, access these source files and the corresponding components.
Because the COBOL source files must have the numbering area and comment area removed, option Cobol-left-margin must be set to 1 (one) and option Cobol-right-margin must be set to 66; these are the default values.
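Stripping the two areas before cataloging can be scripted; this sketch assumes the standard fixed-format layout (sequence numbers in columns 1-6, comments from column 73 on), which matches the margin settings above: after stripping, the code occupies columns 1-66.

```python
def strip_margins(line):
    """Remove the numbering area (columns 1-6) and the comment area
    (column 73 onward) from one fixed-format COBOL source line,
    assuming the standard fixed-format layout."""
    return line.rstrip('\n')[6:72]

# The sequence number and the trailing comment both disappear;
# the remaining code occupies columns 1-66, matching the margins above.
line = "000100 IDENTIFICATION DIVISION." + " " * 41 + "OLD COMMENT"
print(repr(strip_margins(line)))
```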
It defines various "scalar" parameters influencing the conversion and points to subordinate files containing "large" configuration data, such as renaming files. Many of the parameters configurable in this file can also be set on the command line; in this case, the command-line value overrides the configuration-file value.
Although not mandatory, it is advisable to store this file in the same parameter directory as the system description file. The main conversion configuration file contains a free-format, unordered list of clauses, each beginning with a keyword and ending with a period.
Some clauses take one or more arguments; others are boolean clauses with no argument. The keywords are case-insensitive symbols; the arguments are integers, symbols, or case-sensitive strings. Spaces, new lines, and the like are not significant. Comments can be written in the configuration file in two ways:
Comments start with a sharp sign ("#") and extend to the end of the same line. This clause specifies the location of the directory that will contain the complete hierarchy of target files, for both programs and copy files.
The Master-copy directory is related to the copy reconciliation process, see Command-Line Syntax. The dir-path is given as a string. It can be either an absolute path or a relative path; in the latter case, it is relative to the directory containing the system description file, as usual for the Rehosting Workbench tools. Sql-rules : target-sql-syntax. This clause specifies the target SQL syntax.
Its value can be oracle or none. If the value is none, the SQL code in the source files isn't translated; it is transferred as-is to the target components. The default value of this clause is oracle, so it is not necessary to set sql-rules to oracle explicitly in the configuration file. These clauses direct how the file extensions for the converted programs (main source files) and copy files are determined. If the keep-same-file-names clause is given, the converted programs and copy files will have the same file extensions as the original files in the source asset, as cataloged.
The other clauses, if given, will be ignored. If the target-program-extension clause is given, then the converted programs will have the given file extension. If the target-copy-extension clause is given, then the converted copy files will have the given file extension. By default, the converted programs have the file extension cbl and the converted copy files have the file extension cpy.
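Putting these clauses together, a minimal main conversion configuration file might look like the following (the copy-map file name is illustrative; only clause names mentioned in this guide are used):

```
# Main COBOL conversion configuration file (illustrative).
sql-rules : oracle.
target-program-extension : cbl.
target-copy-extension : cpy.
rename-copy-map-file : "rename-copy.map".
```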
The default value of 2 is fairly verbose; higher values are even more verbose; value 1 displays only important error messages. This clause specifies that the copy-reconciliation process (crp) is to be deferred until after the conversion is completed; this allows the COBOL conversion to run in multiple concurrent processes. By default, in the absence of this clause, the copy-reconciliation process is executed incrementally, immediately after each program is converted, which mandates single-process execution.
See the copy-reconciliation process below for more details. This clause directs the COBOL converter to try to convert even those programs that contain FATAL errors although without any guarantees: the converter may produce incorrect results or even crash.
By default, in this case, the converter refuses to work on this program and skips to the next one. This clause specifies the location of the subordinate configuration file containing information to rename copy files, see the copy-renaming Configuration File below.
The file path is given as a string. This clause specifies the location of the subordinate configuration file containing information to rename sub-programs and their calls, see the Call-Renaming Configuration File.
This clause specifies the location of the subordinate configuration file containing the description of manual transformations to apply after the Rehosting Workbench Converter, see the Post-Translation Configuration File.
This clause specifies the name, as a symbol, of the procedure to call to cause a definite termination of the program. This name is used to force termination in situations in which the IBM compiler would force termination but not the target compiler, such as size errors in arithmetic statements.
The default name is. These clauses specify the location of the two subordinate configuration files containing information regarding file-to-Oracle conversion.
These files are generated by the Rehosting Workbench File-to-Oracle conversion tool: respectively, the Conv-ctrl-file (or the Conv-ctrl-list-file) and the Alt-key file.
Only one of the first two clauses must be given: either the conv-ctrl-file clause or the conv-ctrl-list-file clause, but not both. This clause specifies the location of the top-level subordinate configuration file containing information about relational DBMS conversion from DB2 to Oracle. This clause specifies the location of the subordinate configuration file containing information to rename COBOL identifiers which happen to be keywords or reserved words in the target COBOL dialect, see the keywords File for more details.
This allows more control and more flexibility over how programs acquire their current date. For instance, during regression tests, it is necessary to run the migrated programs with the same current date as when the source programs were run; these sub-programs (to be supplied by the Rehosting Workbench users according to their requirements) will allow this.
If any of these clauses is not specified, the corresponding statements are not transformed. This clause specifies the location of the subordinate configuration file containing the list of DB2 stored procedures called directly from COBOL, see the stored-procedure File for more details.
This clause enables the transformation rule which removes the schema qualifier from every SQL identifier which has one. The resulting program will hence rely on implicit schema qualification. This is generally not needed, and possibly even undesirable, but it is useful if you want to run the program concurrently in multiple environments (connecting to multiple databases or schemas), for instance in multiple test corridors.
There is a command-line option of the same name (see the cobol-convert Command) which has the same effect as this clause and is more flexible to use. We therefore believe that the configuration-file clause will seldom be used, except perhaps in projects in which the TP and batch parts of the asset are well identified and strictly separated in the migration project. See the purely-sequential Configuration File for more details. The file-path is given as a string. When present, this clause specifies that the what-string (containing conversion timestamp and converter version information), which the converter normally inserts at the beginning of every converted file, is not to be printed in this execution.
This will seldom be used, unless you really want to hide the fact that your application was migrated using the Rehosting Workbench! When present, this clause specifies that COPY directives referencing copy files which no longer contain useful COBOL code after conversion are to be commented out; by default, these directives remain active. This applies, for instance, to copy files defining whole FD paragraphs for files which migrate into database tables.
See the sql-return-codes Configuration File for more details. This file is associated with the rename-copy-map-file Clause.
There is a .DAT file to download, with .DAT and .INX files. Thanks for all your advice; I checked most of the suggestions and am still checking the rest. The .dat files appear to contain mostly comp numeric fields, not binary. Comp numeric fields store some number of digits in half the space of the picture; for example, 10 digits in 5 bytes. It is impossible to say what the values mean or where they start and stop.
The .inx files are irrelevant to the data in question. Then I tried to analyze the file in hex. The most frequent byte is 00, and the byte 0F occurs regularly. In the previous post I examined the file; if you multiply the record length by the number of records... This is not the Micro Focus indexed file format; it must be something else. The X'0F' is, in a way, a delimiter: it is the last byte of a packed-decimal field.
The 0 is the last (rightmost) digit of the value, and the F is the sign (none). C (positive), D (negative), and F (unsigned) are most common. I checked it with something like br.ReadBytes(Convert.ToInt32(...)) and found nothing. But thanks, everyone, for the suggestions; if someone comes up with something else, let me know. The .DAT file is indexed by the first 10 characters of the record. You are right: hex F is not a general field separator. But a record can contain several kinds of data: character fields, numeric fields (unpacked), and packed numeric fields.
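The packed-decimal layout described here is easy to decode once the sign-nibble convention is known; a minimal sketch in Python (the sample bytes are made up):

```python
def unpack_comp3(data):
    """Decode an IBM packed-decimal (COMP-3) field: two digits per
    byte, with the low nibble of the last byte holding the sign
    (C = positive, D = negative, F = unsigned)."""
    nibbles = []
    for byte in data:
        nibbles.append(byte >> 4)
        nibbles.append(byte & 0x0F)
    sign = nibbles.pop()               # the final nibble is the sign
    value = int(''.join(str(d) for d in nibbles))
    return -value if sign == 0xD else value

# 9 digits plus a sign nibble packed into 5 bytes, ending in x'8F'
print(unpack_comp3(b'\x01\x23\x45\x67\x8F'))  # 12345678
```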
The only thing I can see in the record is that the first field is a character field.
Directly importing? The file may also be indexed, which you'd have to understand in order to handle it.
You may need to unload the files. In general, while some COBOL files will be suitable for loading into a database, others will require programming:
- Multi-record files: probably split into several different tables
- Files with REDEFINES
Accessing the data in the files and loading the files into a database is either going to be expensive or time-consuming, or both. There are some commercial products that provide access to COBOL files; I would imagine they are expensive.
I suggest adding RecordEditor (found at SourceForge) to the list. It seems like way more trouble than it would be worth.