Intergraph has an internal database for keeping track of all kinds of software issues; if you file a Trouble Request or Change Request, it ends up in this database. According to Intergraph sources at Hexagon 2013, this database will become publicly available in the third quarter of 2013. Also, if a problem gets fixed, the solution will be described, so you know whether it applies to you. All this information will be publicly accessible and indexed by Google, so if you are facing a problem in G/Technology (or any other Intergraph product), Google will find it and it will show up in your search results.
Debugging G/Technology 10.2 with Visual Studio
In an earlier post Stephan mentioned that it was not possible to debug G/Technology 10.2 in “edit and continue” mode. You could only attach Visual Studio to G3E.exe and debug, but you could not break into the code and then make a small change: for every change you had to close G3E.exe, compile your project, start G3E.exe and attach Visual Studio again. When we tried to start the 10.2 G3E.exe in debugging mode from Visual Studio, we got an error message.
But there is a way to get this working. The first thing you have to do is adjust the G3E.exe.config located in the \GTechnology\Program directory. You should add the following three lines after the <configuration> tag.
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v2.0.50727"/>
  </startup>
</configuration>
After that you must adjust the Path variable, or check whether the path is already set in your system settings. The following path needs to exist in the Path variable; for a 64-bit environment it is C:\Program Files (x86)\Common Files\Intergraph. If you are using a 32-bit environment, omit the (x86) part.
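On the command line you can check the current value and, if needed, append the directory. A minimal sketch for a 64-bit machine, assuming you want it in your user-level Path (setx only affects newly started sessions):

```
echo %PATH%
setx PATH "%PATH%;C:\Program Files (x86)\Common Files\Intergraph"
```

Note that this setx call merges the machine and user Path into the user variable, so editing the variable via the System Properties dialog is usually the cleaner option.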
After setting the path, it is important to reboot the machine! Now you can fully debug your applications in Visual Studio.
If you are not an administrator on your machine and cannot change the Path variable for some reason, you can also add the path to your working folder in Visual Studio. But if you have multiple projects defined in Visual Studio, you will have to change this for every project.
A ‘would like to have’ G/Technology development environment
Every time I do a G/Technology project I like to discuss how things need to be done. One aspect that always comes up is how limited you actually are as a GTech developer, due to Intergraph. If you or your teammates don’t know how a certain aspect works, you’re doomed. Even if you do know how it works, realising it in G/Technology may be very time consuming. If you are working for an existing customer, he’ll just have to pay more, but if you’re in a benchmark situation, this is a killer. G/Technology is complicated because of the many aspects that come together, and new developers are desperately needed, but I don’t see them. If the current development tools don’t improve, I can’t see many reasons why developers would want to start with GTech, so this sad situation won’t change. Let me describe my ‘would like to have’ G/Technology development environment, which would be very likely to change this:
There’s a nice GUI for manipulating metadata
If you need to modify metadata, you need to create SQL scripts to change metadata tables in the database, and you need to know the relations between these tables, the allowed values, how these values relate to each other, and so on. Developers shouldn’t spend their time figuring out what the initial authors of the software might have meant; they need to be provided with a tool which leaves no room for doubt and really helps get the job done. This tool is a graphical user interface for setting up your metadata.
You can store metadata in the database or in a folder
The same graphical tool lets you choose whether you want to store your metadata directly in the database or in the published metadata folder. Storing your changes straight in the metadata folder is useful for quick prototyping but is not suitable for multi-user development; in that case you would choose to store your changes in the database. If you choose to store metadata straight in a folder, you have a tool to synchronize this folder to the database. This tool creates a set of scripts to create a GTech metadata schema from scratch from the metadata folder. A nice side effect of this tool is that you can always recreate your metadata environment from scratch, which from a quality point of view is very desirable and was very common in the various Microsoft web projects I have done, but which I have never seen in a GTech project.
Published metadata is human readable
The only metadata read at runtime is the published metadata. If you need a quick fix on this, you could use the GUI tool discussed earlier, but this may not be installed on server systems. Also, although there is a tool to read and change the binary Microsoft ADODB-based .raw files in the metadata folder (viewmetadata.exe; ask your Intergraph representative for more details), you want to be able to read the runtime metadata with a tool present on every system: Notepad. Therefore, the published metadata folder should contain XML files representing the published metadata.
You can use G3E.exe as your debugging process to attach to
The 10.2 version will not let you use G3E.exe as your startup executable. Before you can even start your debug session, a whole series of crashes will have appeared, and debugging isn’t possible. This worked just fine in version 10.1.
Oracle packages are readable
Wrapping packages is out of the question; Intergraph is proud of the product and wants people to look at it, to give positive feedback and to learn from it.
Generation packages are stored in a separate schema
All packages needed for generation are placed in a separate schema. No objects other than the ones needed for a runtime GTech environment are allowed in a production schema.
You have many examples to study
Whenever Intergraph releases functionality, the developer in charge publishes his testing scenarios, along with sources, in a reproducible form, on a publicly accessible site.
There are many blogs about G/Technology
There are several Intergraph employees busy writing blogs about how stuff works, where the product is going, discussing things they have seen and so on, thus creating a great community.
You have a forum where you can ask questions
There is a publicly accessible forum for G/Technology. If you google for GTech problems, you often end up in this forum. Intergraph employees are dedicated to answering questions posted in this forum.
All these items would be great to have and would really speed up development. Reality, though, is that you don’t have them, and if you do not know exactly how something is done in G/Technology, you indeed are doomed.
Transferring G/Technology databases for development purposes
If you are busy with a team developing all kinds of G/Technology enhancements, you will find yourself most of the time in Oracle, configuring metadata, creating PL/SQL code or otherwise. When all developers share the same database, it is very likely that things get messed up, and I therefore always like to have a local database to do my development in. I do not need all the data in it, only the structure of the database and the contents of the metadata tables. Once this is set up, I can create my scripts completely independently of the real database, without disturbing colleagues. When I think I am done, I can even drop my database and recreate it to see if my scripts really do what they are supposed to. Only after that do I run my scripts against the real database.
Here is how to set up such an environment.
1 : Export the GTECHDIAG-schema
expdp gtsys/***@SERVER DUMPFILE=gtechdiag.dmp LOGFILE=gtechdiag.log SCHEMAS=GTECHDIAG
You need to replace SERVER with your own server’s tnsnames entry.
Since the GTECHDIAG-schema is locked, you need to export it using a different account. In this case, the GTech-installation schema GTSYS is used.
2 : Export the GTech schema without data
expdp gtsys/****@SERVER DUMPFILE=gt_struct.dmp CONTENT=METADATA_ONLY LOGFILE=gt_struct.log SCHEMAS=gtsys
In this step the whole GTSYS schema is exported without any data in it. This export also includes all the packages, which was not possible using the Oracle exp command.
3 : Export the GTech schema with metadata
expdp gtsys/****@SERVER parfile=g3e.par
The parameter file g3e.par includes legend tables unique to my GTech installation. You should replace them with the legend tables used in your installation. You can retrieve them with the following statement:
select g3e_definitiontable, g3e_settingstable, g3e_displaycontroltable from g3e_legend;
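For illustration, such a parameter file might look like the sketch below. G3E_LEGEND is a standard metadata table, while the MY_* names are placeholders for the legend tables returned by the query above; the real file would also list the remaining standard G3E_ metadata tables, which is omitted here for brevity:

```
SCHEMAS=gtsys
DUMPFILE=gt-metadata.dmp
LOGFILE=gt-metadata.log
INCLUDE=TABLE:"IN ('G3E_LEGEND', 'MY_DEFINITION_TABLE', 'MY_SETTINGS_TABLE', 'MY_DISPLAYCONTROL_TABLE')"
```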
4 : Create local database schemas gtechdiag and gtsys and grant necessary rights
Use regular Oracle statements to add a schema with the necessary rights. The schema name GTECHDIAG is the same in every GTech installation; user GTSYS is the schema used for your GTech installation.
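A minimal sketch of this step, assuming the default XE users tablespace and the standard DATA_PUMP_DIR directory object; the passwords and quotas are examples only:

```sql
-- Run as SYSTEM on the local XE database
CREATE USER gtechdiag IDENTIFIED BY gtechdiag DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
CREATE USER gtsys     IDENTIFIED BY gtsys     DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
GRANT CONNECT, RESOURCE, CREATE VIEW TO gtechdiag, gtsys;
-- Allow both schemas to read and write the Data Pump dump files
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO gtechdiag, gtsys;
```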
5 : Import the gtechdiag schema
impdp gtechdiag/****@XE DUMPFILE=gtechdiag.dmp LOGFILE=gtechdiag.log
6 : Import the GTech schema without data
impdp gtsys/****@XE DUMPFILE=gt_struct.dmp LOGFILE=gt_struct.log
The purpose of this is to create the whole structure of your GTECH-schema without all the facility data in it. If you need facility data in your local environment, you can later copy some rows via a database link or simply add some features.
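Copying a few rows over a database link could look like the following sketch; the link name dev_src and the feature table name are hypothetical, so substitute your own:

```sql
-- In the local XE database, as gtsys
CREATE DATABASE LINK dev_src CONNECT TO gtsys IDENTIFIED BY password USING 'SERVER';
-- Pull a small sample of features from the real database
INSERT INTO my_feature_table
  SELECT * FROM my_feature_table@dev_src WHERE ROWNUM <= 100;
COMMIT;
```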
7 : Import the GTech schema with metadata
impdp gtsys/****@XE dumpfile=gt-metadata.dmp logfile=gt-metadata.log TABLE_EXISTS_ACTION=REPLACE full=y
In this step you import the data you need for development: the metadata tables, including the legend tables unique to your GTech installation. The filename in the DUMPFILE argument should match the filename specified in the Oracle parameter file used in step 3.
8 : Recompile all invalid objects in the GTECHDIAG-schema
The objects in the GTECHDIAG schema rely on objects in the GTSYS schema, which were not present during the import, so there will be invalid objects. You need to recompile them.
9 : Recompile all invalid objects in the GTSYS-schema
Objects in the GTSYS schema rely on objects in the GTECHDIAG schema which were invalid at the time of import. You need to recompile them after recompiling the invalid objects in the GTECHDIAG schema. Once every object is fine, you should be able to publish metadata.
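Both recompilation steps can be done with Oracle’s UTL_RECOMP package, run as a DBA user, in this order:

```sql
-- First the GTECHDIAG schema, then GTSYS
EXEC UTL_RECOMP.RECOMP_SERIAL('GTECHDIAG');
EXEC UTL_RECOMP.RECOMP_SERIAL('GTSYS');
-- Verify that nothing is left invalid
SELECT owner, object_name, object_type
  FROM dba_objects
 WHERE owner IN ('GTECHDIAG', 'GTSYS') AND status = 'INVALID';
```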
10 : Publish metadata
If you can publish metadata, then you have succeeded.
Notes :
- These statements assume the GTech schema is contained in gtsys, that your development database can be accessed via the ‘SERVER’ tnsnames entry, and that you have set up a local Oracle 11 database called XE
- You only need to transfer the GTECHDIAG-schema once since this normally doesn’t change
- Changing the initial extent of tables: if you are transferring a database with many tables, you may run out of disk space because the import process reserves an initial extent for each table. I recently needed to import a database with more than 3000 tables, needing much more than 20G just for importing the structure. To avoid this problem, invoke impdp with the ‘TRANSFORM=segment_attributes:n’ flag. Using this flag you tell the import Data Pump to ignore the segment storage attributes.
- Changing tablespace specifications: the source database may have different tablespaces than the destination database. You can remap tablespaces using the ‘REMAP_TABLESPACE’ flag
- If you need to import the gtech-schema again, you can do so by dropping the local ‘gtsys’-schema and repeating steps 6 – 10.
An example of changing the initial extent and remapping tablespaces:
$impdp VMESDEV/*****@ORCL DUMPFILE=VMESDEV.DMP LOGFILE=VMESDEV.DMP.log TRANSFORM=segment_attributes:n REMAP_TABLESPACE=GC_DATA_FIBRE2:VM,GC_DATA_MISC2:VM
This command maps tablespaces ‘GC_DATA_FIBRE2’ and ‘GC_DATA_MISC2’ to tablespace ‘VM’.
Validating a ConnectionConfigurationMap.ccm file : ccmReader
When setting up a G/Technology environment, one of the tedious things is always getting your ConnectionConfigurationMap.ccm right. One way to see which ddc file is causing you trouble is to use the Sysinternals Process Monitor tool and see what the last ddc file read by the G3E executable was. The file immediately following that file in your ConnectionConfigurationMap.ccm is then the cause of your problem. This approach works, but it is very tedious and can take a lot of time. Therefore, I created a tool called ‘ccmReader’, which stands for ConnectionConfigurationMap Reader; its sole purpose is to check the validity of a ConnectionConfigurationMap.ccm file.
ccmReader’s usage is as follows:
ccmReader [-v] [-l] [-c CONF] [-dumplang] [-showlang] [-help] ConnectionConfigurationMap.ccm
So what does it do? Let’s say you have the following ConnectionConfigurationMap.ccm section and the F152-DetailVerwijzing.ddc file is missing:
MAPFILE=d:\MAPFILES\GEOS\DDC\9.4\F148-Oov.ddc;
MAPFILE=d:\MAPFILES\GEOS\DDC\9.4\F159-Waterkruising.ddc;
MAPFILE=d:\MAPFILES\GEOS\DDC\9.4\F152-DetailVerwijzing.ddc;
MAPFILE=d:\MAPFILES\GEOS\DDC\10.1\F252-InfraLand.DDC;
MAPFILE=d:\MAPFILES\GEOS\DDC\10.1\F252-InfraLand_1.DDC;
ccmReader would then report that there are two files which should be present according to the ConnectionConfigurationMap.ccm file but could not be read: d:\MAPFILES\GEOS\DDC\9.4\F152-DetailVerwijzing.ddc and d:\MAPFILES\GEOS\DDC\9.4\F152-DetailVerwijzing.cdt. So without even trying to start G/Designer, Netviewer or G/Netplot, you know it will not work, since your configuration is wrong and you need to fix it. Once you have corrected the errors and every file can be found, the tool will not output anything unless you use the verbose (-v) flag.
I use this tool very often in my day-to-day work, and it comes in very handy.
You can download the executable here.
Hope this helps, Stephan
G/Technology & Microsoft security patch July 2012
Installing the Microsoft security patches for July 2012 on a machine where the G/Technology Netviewer server is installed may lead to problems. This has been confirmed by Intergraph on the Intergraph customer support website:
Support Notice
Microsoft Security Patches for July 2012
July 19, 2012
Based on Intergraph SG&I’s Quality Assurance testing of the Microsoft Security Patches for July 2012 we have identified some issues when running the G/Technology suite of products in conjunction with these patches. At this time we don’t recommend G/Technology customers installing the July 2012 Microsoft Security Patches. We will update the Support website with additional details as soon as this problem is resolved.
For more information about Microsoft’s July 2012 updates, refer to the following website: http://technet.microsoft.com/en-us/security/bulletin/ms12-jul
If you have any questions, please contact the Intergraph – SG&I Help Desk at 1-877-822-8921 for assistance.
The G/Technology developer experience
The G/Technology runtime environment is great, and I haven’t seen many issues with it while working at KPN Netherlands from 2009 until recently. A whole different story is the G/Technology developer experience, and I’m very frustrated about it:
There’s no clean distinction between development and runtime environment
In G/Technology your feature definitions are stored in the database and made available to clients through a process called ‘metadata publishing’. This process calls some procedures and packages in the database to gather all the information needed, stores it in temporary tables called ‘optimized tables’, and then writes the content of these tables to a folder as .raw files.
This folder is then read at runtime, and much of the information in the database is not needed anymore. All the tables starting with G3E.. and many of the Oracle packages and procedures are only needed at development time, not at runtime. In my view this is a clear pollution of the database and a mixing of concerns. A better way would be to make a clear distinction between a development schema and a production schema, and maybe take all the generating procedures out of the database and store them somewhere else. Visual Studio T4 templates would be an ideal candidate to replace the code-generating packages in the database.
It takes too much time to test and debug your features
Back in the days of FRAMME, you had a great tool called ‘FRAMME Knowledge Based Tools Rulebase Builder’, or FRMKTRB/FRMKTNUC as I recall its product name was; it essentially was a feature-definition generating tool (a rulebase is what you would now call metadata). It would allow you to generate, compile and test your feature definitions fast, easily and in a very user-friendly way. The only downside was that it was not multi-user enabled and it had its own datastore to store definitions. The options you have now for developing your definitions are one of these:
- The G/Technology project tool
- Metadata explorer from within G/Technology
- Use SQL-scripts