The future is cloud-y for engineering data management

Feb 2015

Lately I have been deluged with announcements of, and introductions to, a series of cloud-based data management systems for design engineering that also focus on collaboration. I plan for this blog to be the first in a series that explores new PDM/PLM (PxM) solutions for product design.

Before I begin, we need to clarify the difference between PDM and PLM. PDM manages design changes during product development, while PLM manages engineering and other changes made after the product is released from engineering to manufacturing and other downstream processes. Using this definition, PDM can store all sorts of information during the design, or work-in-process, stage. Such information might include, but is not limited to: product specs, preliminary designs, analyses and simulations, product versions, QC specs, engineering BOMs, material types, etc. PLM systems might include the PDM data managed during design as well as other data, such as manufacturing BOMs, manufacturing instructions, NC data, service tracking, cost data, customer-level documentation, etc. I think you get the picture.

PTC’s recent announcement of PTC PLM Cloud, a webinar I attended about GrabCAD Workbench, and Onshape’s inherent use of a cloud-based solution all piqued my interest. I began wondering about the differences between them and how one might choose a solution for a mid-sized firm. One obvious differentiator is how cloud-based PxM software connects to CAD software, be it desktop CAD or cloud-based CAD. By the way, if you have not seen Onshape’s Dave Corcoran’s blog post “The blue screen of death,” I urge you to read it now. Corcoran discusses some of the benefits of a cloud-based PxM–CAD implementation.

A true cloud-based system allows full use of easily extensible computational capability and virtually unlimited storage

A cloud-based PxM system may not be much different from the tired old server-based software that has been promoted for years. Adding a web-based interface and hierarchical data storage in the cloud merely masks an antiquated architecture. The old approach of bolting external data management software onto CAD simply does not work well enough: it is too laborious, takes extra time, and makes little use of design info developed automatically during the design cycle. Its lack of adoption to date verifies this assumption.

A true cloud-based system should be radically different in architecture, allowing full use of the cloud's flexibility. For instance, one reason I always disliked the previous generation of PDM/PLM systems was their outdated reliance on text-based interfaces. I would expect modern PxM systems to be graphically oriented, offering comprehensible, visual navigation within the product structure. Such a system should offer a tight connection to related CAD systems and automate much of the data management function. Automatic backup and easy restore of historical data are mandatory functions, as is easily distributed design among partners, along with IP (intellectual property) protection.

The vendors are all moving quickly to position (or re-position) their PxM systems as cloud-based

The plethora of cloud-based data management systems for engineering and CAD includes the following (plus some I haven't yet discovered): Autodesk PLM 360, Onshape, GrabCAD Workbench, PTC PLM Cloud, and Kenesto, as well as Dropbox and related cloud drive systems. More traditional software is offered by Aras, Dassault Systemes, and Siemens PLM Software. What follows is a summary of how some of these vendors are positioning their software.

  • Onshape promotes distributed design. Using cloud-based CAD along with a fully integrated cloud PDM system allows a brand new perspective on how modern CAD systems should work. Essentially all costs for compute power and data storage are greatly reduced, easily scaled up, and can even be "borrowed" for a short duration.
  • GrabCAD’s Workbench calls itself “The fast, easy way to manage and share CAD files without PDM’s cost and hassle.” The company goes on to state “Workbench allows teams on any CAD system to work smoothly together by syncing local CAD files to cloud projects, tracking versions and locking files to prevent conflicts.” The enterprise version costs $89 per month.
  • PTC recently announced PTC PLM Cloud, stating “this solution leverages the power of PTC Windchill, while simplifying PLM adoption with a flexible, hosted subscription offering, deployable at a pace that matches the needs of SMBs.” I am not exactly sure what this means, but expect to clarify this when I speak with PTC this week.
  • Very soon, Kenesto plans to announce a cloud-based system that Steve Bodnar, VP of Strategy, calls a terrific solution for small shops: it enables them to replace their error-prone, in-house, server-based file systems with a much higher-function cloud-based system that requires minimal change to the way CAD users work, yet improves the reliability of their data management.

So how can an engineering organization decide which PxM technology to buy and invest its time and money in? More detail about the various implementations, and my assessment of them, will be forthcoming in future blogs.


Ray Kurland — I have returned to consulting and analyzing systems for TechniCom, from an early and erstwhile retirement.


Cloud PLM Systems ease collaboration

While there have been several articles negatively discussing the use of cloud software for CAD, users should be aware that for the PLM aspects of collaboration, a cloud-based system is by far the best way to go. Okay, there are a few cons to using a cloud system, such as concerns over security and potential downtime over which users have no control. Security may be a major concern for government projects requiring very high levels of data security; for 98% of users, it should not be. That does not mean you should blithely ignore what security your chosen cloud vendor provides – by all means make sure your concerns are met. But today's security and encryption seem more than adequate for most users, provided they are properly executed and monitored. You might even want to consult with independent security experts before committing to a solution.

Nevertheless, there are quite a few benefits that far exceed the alternative – maintaining an internal server capability.

First, I need to assume that a typical user installation has the following characteristics:

  • More than one engineering facility at which design is done
  • Multiple suppliers that need some type of restricted access to the design data


Assuming this is the case (and I'll bet that more than 80% of users fit this category), here are just a few of the advantages cloud-based PLM software accrues:

  • Little or no IT required for installation, setup, software updates, or backups
  • A single copy of the database that does NOT require synchronization among multiple servers
  • Easy management by database administrators
  • Possibly lower software costs
  • No personnel or space costs for servers
  • Ready internet access via various speed connections worldwide
  • No special costs for high-speed telecomm connections


I can think of only two PLM systems that are completely architected for cloud operations: Arena Solutions and Autodesk 360.

My Twitter feed at PlanetPTC Live 2012 expanded with additional comments

7 Jun 2012


I attended PlanetPTC Live 2012 as a media and analyst guest of PTC earlier this week. I was free to mingle with any user in attendance, and attend the general sessions, as were the other 75 or so media representatives. PTC also organized special sessions for the media. These sessions generally were more concise and allowed more direct interaction with PTC executives, other management and selected presenters. [Disclosure: PTC paid for my airfare and hotel accommodations.]

I tweeted during the events I attended – not prolifically, as some other tweeters do, choosing instead to focus on what I found interesting and the highlights of some sessions. I have taken most of these tweets and expanded on them below for my blog readers. In a blog to be posted soon, I might add additional comments.

In general, the conference was upbeat and well organized. With Creo and Windchill almost evenly divided in terms of revenue, the two lines of business account for some 80% of PTC revenue. The other three (ALM, SCM, and SLM) make up the balance, but represent substantial future growth areas for PTC; all three are collaborative businesses based on Windchill, with SLM being the newest. With PTC now organized by lines of business, each with its own P&L, customers are better represented.

Tweets expanded (tweets are identified by the • symbol, followed by an expanded explanation)

  • In the exec wrap up on Tuesday, Brian Shepherd confirmed plans for an entry level Windchill. Pre-configured for smaller users.

More: While I had not heard of such an activity, some media had, and they asked about the status of the project. As best I can recollect, this may come out in 2013 – probably one reason why Windchill ProductPoint was decommissioned last year. Remember that product, which relied on Microsoft SharePoint?

  • PTC realigns organization structure by lines of business, each with P&L responsibility. CAD, PLM, ALM, SCM, and SLM.
  • SLM is service lifecycle management. According to EVP Barry Cohen, an underserved market.
  • Mike Campbell now heading up MCAD segment. Brian Shepherd and Bill Berutti head up other 4. Development reports to EVP Rob Gremley.

More: Here are the relevant descriptions from the latest PTC company info flyer:

Rob Gremley EVP, Product Development & Corporate Marketing
Brian Shepherd EVP, PLM & SCM Segments
Bill Berutti EVP, ALM & SLM Segments
Mike Campbell Division General Manager, MCAD Segment

  • Problems reconciling EBOMs and MBOMs? Now there’s another – SBOMs. Service BOMs add parts kitting.

More: Users have struggled with developing and managing manufacturing BOMs for decades. Now add a new one for managing service practices – the Service BOM, which describes the product from a service point of view. These often contain groups of parts that may be replaced as one unit in the field.

It looks like Windchill MPMLink today manages this process for MBOMs and EBOMs in those companies that use Windchill and Creo. With PTC constructing a Service Lifecycle Management business unit, I am not sure where or how the SBOM relates to the other BOMs and how it is managed. I am sure PTC has thought this out and can provide an answer.

  • Campbell highlights Creo Layout and Freestyle as providing impetus for move to Creo.

More: These two Creo apps are new for Creo 2. Both are targeted at giving users easier-to-use modeling methods, fully integrated with Creo Parametric, and both play in the concept design space. PTC stressed the connection into Creo, rather than having a stand-alone concept design system – a dig I am sure was meant to rattle the cage of companies using Alias (from Autodesk), today's most widely used application for industrial and concept design.

  •  PTC positions Creo 2 as opening the floodgates for Wildfire transitions. No cost to users. UI and functions better.

More: Brian Shepherd said this on the first day in the main tent session. For those of you not aware of the term, "main tent" relates back to my days at IBM, where the main tent was where all the attendees gathered together, as opposed to the breakout sessions. I guess back in the early days IBM held these sessions under tents – companies were smaller then.

  •  With release of Creo 2, PTC encouraging third parties to develop [apps]. None available from third parties yet. Opportunity to fully integrate.

More: In a follow-up conversation with Brian Thompson, VP of Product Management for Creo, he stated that the requisite APIs are not fully available yet. They will be by Creo 3 and Creo 4. Creo 4, I asked! Yes, he said, by Creo 4 – or two years from now. Third-party developers might want to clarify this directly with PTC.

  • Option modeling another approach to developing ETO configurations. Another approach to developing requirements based models?
  •  Option modeling marries Creo 2 and Windchill 10.1. Can add PLM config options based on geometric positioning.

More: Option modeling allows a concise description of a product with many variants. In some systems, users plug all the variants into a parametric model containing all of the variant options. This often results in a very large model with an obscure definition of when each variant is used. Creo 2 and Windchill aim to solve this by combining the geometric properties of Creo with the data management properties of Windchill. For example, in a bicycle, all wheels are attached to hubs; thus one need only keep track of the different wheels, along with any modifications to the geometric model for the various wheels. Filters and equations are used for the definitions. At least I think so – I only saw a five-minute video example.
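To make the idea concrete, here is a minimal sketch of variant selection via filters and equations. Everything in it – the names, the `Variant` structure, the rim diameters – is my own invention for illustration; it is not PTC's option-modeling API, only the general pattern the demo suggested: each variant carries a filter (when it applies) and equations (how it drives geometry), instead of cramming every option into one huge parametric model.

```python
# Illustrative sketch only: invented names, not PTC's actual API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Variant:
    name: str
    condition: Callable[[dict], bool]      # filter: when is this variant used?
    rim_diameter: Callable[[dict], float]  # equation: geometry driven by options

# Bicycle wheel variants, each with a filter and a simple sizing equation
variants = [
    Variant("road_wheel",     lambda o: o["use"] == "road",     lambda o: 622.0),
    Variant("mountain_wheel", lambda o: o["use"] == "mountain", lambda o: 559.0),
]

def resolve(options: dict) -> tuple[str, float]:
    """Pick the single wheel variant whose filter matches the chosen options."""
    matches = [v for v in variants if v.condition(options)]
    assert len(matches) == 1, "option filters must select exactly one variant"
    v = matches[0]
    return v.name, v.rim_diameter(options)

print(resolve({"use": "road"}))  # ('road_wheel', 622.0)
```

The point of the pattern is that the model stays small: only the filters and equations grow with the number of variants, not the geometry itself.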

  • Attending Cummins case study of integrating mfg and product intros. Closing the loop between the two.

More: Dr. Michael Grieves, author of several books on PLM, along with Apriso, revealed a startling efficiency claim for Cummins, which integrated its PLM, ERP, and MES systems. See if you can get a copy of his slides for an explanation.

  • Main tent sessions focused on Creo 2.0 and hints of what’s to come. Main business line highlighted. Campbell: great job on CAD.

More: On the first day PTC revealed what's new with upcoming products and its near-term vision for the future.

  • Chief customer officer – Mark Hodges. Never heard of that title.

More: From Wikipedia I found out that a chief customer officer (CCO) is defined as “an executive who provides the comprehensive and authoritative view of the customer and creates corporate and customer strategy at the highest levels of the company to maximize customer acquisition, retention, and profitability.” The CCO typically reports to the chief executive officer, and is potentially a member of the board of directors.

  • High of 97 degs expected today at PlanetPTC in Orlando. Hot AND humid. Good to be inside with A/C all day.

More: Guess someone got a good discount for holding it here this time of the year.


Siemens PLM Software’s Active Workspace mines product data

18 April 2012: For a long time I have been less than an enthusiastic advocate of PLM-based systems as a vehicle for managing development processes. Sure, storing data is an important part of gathering product development data. Yet viewing and using the associated data was always difficult. Coming from a CAD background, which provides glorious views of 3D products, I found that scrolling through page after page of data tables quickly becomes mind-numbingly tedious. Often one needs to view different datasets in different ways to attain even a glimmer of the data needed for decision-making.

Siemens' recent announcement of Active Workspace (AWS) for their HD-PLM environment is their latest, and best, attempt to present data graphically to the user, making the system perform the work of visually integrating the vast amount of product data.

Last Fall, while attending a Siemens analyst conference, the company rolled out some preliminary information about Active Workspace. I was excited then about the long-term possibilities of revising the ways users can extract and make use of vast amounts of data. This announcement provides a very useful beginning for this project.

Exactly what is HD-PLM? It's not a product but an architectural framework. HD-PLM, announced two years ago, provides a technology foundation enabling Siemens' product development teams to produce a common set of integrated software tools that identify, capture, and collate the massive amount of information available in manufacturing enterprises, and apply meaning to that data using an intuitive visual environment.

Two weeks ago, on 3 April 2012, Siemens announced Active Workspace Version 1, the first product to realize the beginning of that vision. Chuck Grindstaff, president and CEO of Siemens PLM Software, noted that "Active Workspace creates an intuitive and personalized 3D graphic interface that significantly enhances the ability of our PLM suite of offerings to deliver knowledge instantly to the right people, at the right place and in the right context to support rapid and intelligent decision making."

To find out more I sought out some details from Siemens and had a conference call with two product managers for AWS: Bill Lewis and John Whetstone. They described AWS as having the following capabilities:

  • Find information fast
  • Visualize and navigate
  • Compare and report
  • Collaborate
  • Configure and share control

Lewis described AWS as a tool supporting HD-PLM's vision of semantic data understanding, enabling users to make smarter decisions. Products are indeed getting more and more complex, and HD-PLM aims to address this. He sees AWS as a tool for all PLM users – not just professional users, but casual users as well.

The slide below shows the products supported by AWS.

Products supported by Active Workspace V1

An example of AWS in action

Whetstone performed a live demo for a sample company, starting with a search for all objects in the company's database. The search yielded the following 130,581 results:

Searching for all objects

The data was taken from Teamcenter and indexed beforehand to achieve the speedy result – the search took only a few seconds. Note the object filter types at the top of the screen.
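To illustrate why pre-indexing makes such a search feel instant, here is a toy sketch of my own construction (not Siemens code): group the objects by type once during extraction, and a facet filter such as "Physical Design Model Elements" becomes a dictionary lookup instead of a scan of 130,000+ records.

```python
# Toy illustration of type-faceted indexing; data and type names invented.
from collections import defaultdict

objects = [
    {"id": "P-100", "type": "Physical Design Model", "name": "hard drive"},
    {"id": "P-101", "type": "Physical Design Model", "name": "drive head"},
    {"id": "D-200", "type": "Document", "name": "FEA results"},
]

index = defaultdict(list)
for obj in objects:               # done once, when data is extracted
    index[obj["type"]].append(obj)

# Facet counts, like the filter bar at the top of the AWS screen
counts = {t: len(objs) for t, objs in index.items()}
print(counts)  # {'Physical Design Model': 2, 'Document': 1}

# Selecting a type filter is now a direct lookup, not a full scan
print([o["id"] for o in index["Physical Design Model"]])  # ['P-100', 'P-101']
```

The real system presumably indexes many attributes, not just type, but the trade-off is the same: pay the indexing cost once at extraction time, and every interactive filter afterwards is cheap.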

Selecting the type – Physical Design Model Elements – resulted in this:

Revealing Physical Design Model Objects

These are the physical mechanical model elements of the company, each designated with a different part number.

Shown below are the types of objects from which we can choose at a high level. This data already exists in Teamcenter, from which it is extracted; the extraction methods and refresh rates are user determined.

The types of object filters offer ready access

Drilling down to the hard drive we get only 61 objects:

View of model elements in one sub-assembly

This is the tile view, showing access to other data associated with each object, such as: revision, owning user, type, and other data. Along the right side are icons that can launch applications, such as shape search and visual navigator, where used, and more info.

Visualizing the top-level hard drive using the JT object format, here is the result:

Visual navigator display for the hard drive sub-assembly

Note the pan, zoom, and rotate options at the top of the image above. Drilling down to the drive heads shows this:

Displaying a sub-assembly within the hard drive

The “more info” icon reveals attached documents such as FEA results.

Revealing additional information associated with the drive heads

AWS requires the user to have a license of Teamcenter (TC). It is a companion product to TC.

My impression:

This is a big deal for Siemens and their users. It basically allows data mining of related data – or, as Siemens refers to it, the semantics of the data. Semantics, the problem of understanding, allows one to make sense of the miasma of data relations associated with products: what is the product, what functions does it perform, where does it fit, what were the specifications and were they met, what tests were performed, and so on. All of this data is stored within Teamcenter, but making sense of data stored in different databases is difficult.

AWS seems aimed at the largest users. Early adopters include GM, Ford, JPL, and Rolls Royce.

While I was excited by the AWS capability, this release seems to have an awkward and incomplete UI. The reporting and rollup capabilities also need to be extended: cost rollups and product status do not seem to be available yet, though they are on the drawing board, and searches are limited to single attributes.

AWS V1 is available now. Maintenance releases are scheduled for July 2012 and Nov 2012. While the next major release is not due until the end of 2013, there is plenty in this release to keep users busy.

Pricing seems modest at $750 per named user, but for large installations this could add up quickly. I expect there are volume discounts.

As far as competition goes, only Dassault Systemes, with their V6 Enovia 3D Live offering, is even in the same ballpark.

More info can be found at

Autodesk 360 and Nexus – PLM 1.0: not perfect – but a great start

3 Dec 2011: Errata. I was incorrect in stating that Buzzsaw was a local PDM vault for AEC/BIM. Several people have written me about this, one being Stephen Bodnar of Autodesk. Bodnar stated that "Vault is the on-premise DM solution for both industries, whereas Buzzsaw is cloud-based and is also built on Autodesk's Cloud, and is intended for design file collaboration between partners/suppliers and other users and does, in fact, have bi-directional push/synchronization with Vault."

1 Dec 2011: I am on my way back from Las Vegas, where AU 2011 was held. The highlight of the event, at least for me, was the announcement of what I am calling Autodesk PLM 1.0. The announcement was not a well-kept secret, but the content of the announcement was closely held.

Monday's media day preceded the conference. The actual PLM announcement came late Tuesday morning. Carl Bass retracted his oft-quoted remark about PLM not being something customers worried about; instead, it was revised to mean "until the technology was right." I couldn't agree more with his reasoning. Most of Autodesk's competitors offer expensive, difficult-to-use, and almost-impossible-to-install PLM systems that have rarely met expectations. Even then, success often comes at the cost of massive consulting assistance, rarely meets anticipated timeframes, AND generally involves implementing substantially revised business processes.

Unlike my analyst peers, I have always been skeptical of such large and costly projects. Not being on the implementation side, I could afford to be. Many such projects, aside from basic PDM, seldom actually get implemented; most stall. Autodesk estimates that most deliver only PDM. To test this thesis, I tweeted my followers and asked what they had accomplished. With just a few responses, this is hardly scientific, but several stated that they did not yet have even PDM fully implemented!

So what was actually announced? The system is being called Autodesk 360. It is based on having locally installed PDM; for mechanical and for AEC, this is Vault. Buzzsaw, a cloud-based application, provides design file collaboration for AEC teams. The third, and new, software piece is called Nexus. The dictionary describes the word nexus as a "connection," which is a good description of what the software aims to do. In the following discussion I concentrate solely on mechanical PLM; for information on Buzzsaw and how it uses Nexus, readers will have to go elsewhere. Try here.

Nexus is cloud based and comes with 140 or so apps. Each app looks like a series of specialized templates, along with user-customizable workflow logic. Delivery is expected by the end of March 2012. No pricing was announced; however, the implication was that it would be modest, sold on a per-user subscription basis. All Nexus data and apps run in the cloud, using an ordinary browser, while the mass of CAD data remains locally hosted in Vault. Having and maintaining Vault locally avoids the problem of loading very large datasets to the cloud while still maintaining some degree of interactivity.

How will it interface with Vault and other PDM systems? Very well with Vault. No connectors were announced for other PDM systems, though Autodesk hinted that this is a good opportunity for third-party developers and VARs; connections with Nexus could be implemented via as-yet-unannounced APIs.

Today, the connection between Vault and Nexus is one way: CAD data cannot be sent from Nexus to Vault, nor is it synchronized among Vaults as is done among Apple's iCloud apps. However, Vault data is automatically synced up to Nexus. Expect bi-directional sync in the future.

Is it easy to install and operate?

Keep in mind that my total exposure to Autodesk 360 Nexus comes from a 30-minute main stage presentation, followed by a 60-minute working session where about 20 people per workstation watched a very capable Autodesk developer demo the product and respond to questions, often by showing us how Nexus would solve the proposed problem.

Nexus appears to be an out-of-the-box system. It comes with predefined templates and workflows, yet they can easily be extended and/or modified. Fields within templates (apps), and their characteristics (such as numeric, values, dates, etc.), can be defined on the fly. A Visio-like graphic interface defines workflows, and many are offered in the starter system. A typical administration system allows assigning users to tasks and roles. Somehow, data fields can be interconnected, providing visibility into what drives, or is driven by, what.
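As a rough illustration of the template-plus-workflow idea described above, here is a small sketch. The field names, types, and workflow states are all hypothetical – invented by me, not taken from Nexus – but they show the shape of the thing: a template is just a set of typed fields defined on the fly, and a workflow is a state graph of the kind a Visio-like editor might draw.

```python
# Hypothetical sketch of "template plus workflow"; names are invented.
FIELD_TYPES = {"numeric": float, "text": str, "date": str}

# A template (app) defined on the fly: field name -> characteristic
change_request = {"cost_impact": "numeric", "description": "text", "due": "date"}

# A simple workflow as a state graph: state -> allowed next states
workflow = {
    "submitted": ["in_review"],
    "in_review": ["approved", "rejected"],
    "approved": [],
    "rejected": [],
}

def validate(record: dict, template: dict) -> bool:
    """Check that a record supplies every template field with the declared type."""
    return all(
        field in record and isinstance(record[field], FIELD_TYPES[kind])
        for field, kind in template.items()
    )

rec = {"cost_impact": 1200.0, "description": "new hinge", "due": "2012-03-30"}
print(validate(rec, change_request))        # True
print("approved" in workflow["in_review"])  # True
```

Because both the template and the workflow are plain data, an administrator can add a field or reroute an approval path without touching code, which is presumably what makes the "out of the box yet customizable" claim plausible.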

So, there you have it. I imagine Autodesk will soon, if not already, have many seminars and pre-recorded AVIs showing the software. Try here:

My conclusions

I think the product is outstanding. Being cloud based resolves many operating issues. Some users might question the security aspects of hosting much of the data remotely, and would do well to satisfy themselves on that point. But perhaps except for very special circumstances, cloud-based security might even be vastly superior to what they could implement locally. I think this is a non-issue.

Cost-wise, I think this will prove to be much less expensive in the long term than most of today's solutions. Again, a non-issue. Just take a look at the slide below, presented by Stephen Bodnar, Autodesk VP of Data Management, comparing costs for a 200-user deployment.

For collaboration, data can be uploaded either in summary format or as detailed CAD files. Nexus has controls over which user sees which data.

Included are project management capabilities that automatically roll up status from completed sub-tasks. Defining projects involves defining sub-projects with easily configurable tasks and reporting procedures. If you have already implemented workflow as part of Vault, it should be redone using Nexus, which allows more flexibility and better visibility.

If you want visibility by projects, by project managers and contributors, with flexibility to change workflows and processes to meet how you do business, it’s all there. My only question is how soon can I get it?

Ray with his skeptical face during AU2011


Here are a few slides from the presentation to give you an idea of what Autodesk presented. Sorry for the quality – I used my phone.

The overall concept of Autodesk 360.

Stephen Bodnar discussing their view of PLM:

Why is it called 360? Showing how the Vault and Buzzsaw make up local PDM systems:

Brenda Discher discussing why users don’t like competitive PDM systems.

What Autodesk is doing about it with Nexus.

Inforbix pricing errata, Autodesk Vault to the cloud

In my previous blog, I made an error on the pricing of Inforbix, which I have since corrected. I wanted to make sure you have all seen that correction. In the pricing example given, for a company of 100 people with 30 engineers, Vic Sanchez estimated that they might have 100K to 200K files to be indexed. The annual price of Inforbix for a customer that size would be $10K to $15K – a great price range for the service provided, and in fact a very compelling one.

In the meantime, it looks like Autodesk is planning to announce that Vault will now be cloud hosted. I have no details other than some early teasers provided by Autodesk. It will be interesting to compare these offerings. I am planning to attend Autodesk University and will be there Monday through Wednesday, Nov 28-30. Say hello if you see me. I will report on this upon my return.

Inforbix – a new approach to cloud based PDM

18 Nov 2011: Oleg Shilovitsky, one of the more prolific bloggers in the PLM industry, recently announced his new venture – Inforbix LLC.

Last week I had the chance to speak with Oleg, the CEO, and his partner, Vic Sanchez, about what their new offering is all about. Of course, I suspected that the new company, given Oleg's background as a development manager of PLM systems, might be about PDM or PLM – and of course I was right. But I wanted to find out what the product was all about: who founded the new company, what its objective was, a little bit about the technology, who might use it, and what it might cost.

Oleg and Vic were most accommodating in helping me understand and ferret out answers to the above questions.


Inforbix began development on its product in early 2010. The product was officially launched last October, 2011, and had been in beta since April. Shilovitsky teamed with a Russian development team to bring the product to fruition.

About the product

In a nutshell, here is what I learned. Inforbix today consists of a product data crawler app that is installed on the target system or local network containing the product data to be indexed. After user customization of the crawler app – which basically tells it where to find the data – the app goes to work finding relevant product data, exploring the metadata stored within the data files, and indexing it. No actual data files are uploaded to the cloud, only metadata and file locations. What makes this exciting is that the crawler can crawl through many data types and vaults and decode the inherent metadata and product structure.
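The crawler concept is simple enough to sketch. The following is purely illustrative – my own minimal construction, not Inforbix's implementation – showing the key design point: only metadata and file locations go into the index, never the file contents themselves.

```python
# Illustrative metadata crawler; not Inforbix code. File extensions are
# examples only. Only metadata and paths are indexed; contents stay put.
import os
import time

def crawl(root, extensions=frozenset({".sldprt", ".sldasm", ".pdf", ".docx"})):
    """Index product files under `root` by metadata only."""
    index = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            if ext not in extensions:
                continue
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            index.append({
                "path": path,                # where the file lives, locally
                "type": ext,
                "size": stat.st_size,
                "modified": time.ctime(stat.st_mtime),
            })
    return index  # only this metadata would be sent to the cloud
```

A real crawler would additionally open each file format to decode its embedded metadata and product structure (the hard part, and Inforbix's differentiator), but the security property the company claims follows from this shape: nothing leaves the premises except an index.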

Targeted at small and medium-sized CAD-using companies, the objective of Inforbix is to "help people find, reuse, and share product data."

Both the crawler app and the cloud-based search environment are optimized for manufacturing and design companies. I like that non-vaulted data such as Word docs and PDFs can be "related" back to the products.

The system today supports crawling CAD and PLM data from Autodesk, PTC, SolidWorks, and Siemens, with more coming in the future. Also supported are PDF, Word, and Excel files.

A few niceties

It is secure, since no files are changed, moved, or uploaded. Being cloud based, little maintenance or local support is needed. It seems affordable and priced right: the first 20K files are free, and each 20K files after that costs $600 per year. Sanchez estimated that a typical medium-sized company with 100 people and 30 engineers might spend $10K to $15K per year – a seemingly small cost considering that no hardware and no support staff are needed for the service. It also immediately allows accessing the data worldwide using a browser. Asked what happens if indexed data moves, Shilovitsky said that the crawler monitors and tracks the new location and updates the cloud.

Inforbix offers many ways to present the data to make sense of the product connections, including Excel-like tables and filters.

I see a few drawbacks and improvements needed

The original data still needs to be maintained, along with support and local data backups. A local PDM system might still be needed to support applications that depend on understanding the product data structure. Further discussion is needed on how the system allows role-based access to the data – for instance, how can suppliers access it? And relocated data might see a delay before the indexes are updated in the cloud.


I really like the concept and the possibilities for extending it to other areas of a company. It seems that it would be relatively straightforward to have different crawlers looking for different data types. Think of it as a private Google for the data in your entire company – a way to get organized without the fuss. If you are a company without a PDM system (and some 75% of companies are), then this is a perfect way to get started.

Try it out

With a free entry price, it makes sense to give this a try.

A few ways to learn more

The company:

The latest press release:

Oleg shows how to start using Inforbix in 20 min: