Rhino 5 Beta Features a Gumball Manipulator

15 April 2012: Last week I had a chance to sit in on a webinar hosted by Novedge about the new Gumball manipulator in Rhino 5. Why it's called the gumball manipulator I have no idea, and apparently neither does anyone at Rhino (the question came up during the session). Brian James of Robert McNeel & Associates presented.

The gumball manipulator enables an impressive range of direct modifications to surfaces and curves.

Here is a view of the manipulator, selected to operate on the yellow curve.

Gumball Manipulator

It can perform translation, rotation, and scaling on the selected object. It can also be used to create geometry. This curve can be used to create the first solid, as shown below.

Scaling and translating the top face to modify a solid.

Eventually, using the gumball, other Rhino functions, and a few additional curves, the presenter created this faucet.

The faucet

The additional curves shown in the next image were used to generate the final sink.

The final sink model

This was all pretty impressive and demonstrated that McNeel is continuing to develop Rhino into a "solid" CAD system featuring advanced curves, surfaces, and solids, as well as a unique UI. All at a modest price.

Take a look for yourself via the recorded webinar.

More info:

Link to the recorded webinar:

http://www.rhinojungle.com/video/novedge-webinar-series-episode-43-rhino-5-overview-featuring-the

www.novedge.com, a leading on-line superstore, has lots of video demos available for many products.

Scan&Solve offers mesh-free FEA

19 March 2012: Recently I had a chance to sit in on a demo of Scan&Solve™, software promising to solve parts for linear stress FEA virtually automatically, without any concern about meshing the part. Used in conjunction with solid models of parts developed in Rhino, the demo did just that. To verify the accuracy of the results, the presenter adjusted something called the resolution of the "geometry scan" of the part. Adjusting the resolution showed that the results were converging. Wow, I thought. Time to find out more about how this worked, how extensible it was, how it differed from traditional FEA, and what it cost. I went to the company website and soon located the founder of Intact Solutions LLC, the company that authored the software: Vadim Shapiro, Professor of Mechanical Engineering and Computer Sciences at the University of Wisconsin-Madison.

Right away, I figured uh oh, an academic. They are not always known for providing crisp answers. Nevertheless I requested and was granted an interview with Dr. Shapiro. He turned out to be very open and enthusiastic about the product.

Vadim Shapiro, Intact Solutions founder

Here is what Scan&Solve does differently from traditional FEA meshers and related solvers:

  • Scan&Solve is not a replacement for FEA; it is an extension of FEA, which aims specifically to solve the problem of CAD/CAE interoperability. Any reasonable geometric kernel and any reasonable FEA package can be interfaced with great benefits. The goals of the product are simplicity, universality, and complete automation.
  • The current version analyzes parts only, not assemblies.
  • Instead of meshing, the software assigns an analysis space (a grid) surrounding the part to be worked on, as shown below:

How Scan&Solve works

  • Scan&Solve interfaces with the CAD system, which supplies the coordinates of the model for each point on the grid. Given this interface between the CAD system and S&S, there is no need for a mesh to be created; instead, the software works with the precise model geometry. Scan&Solve directly modifies the basis functions, sometimes called "shape functions," which approximate the solution of the problem. In the current implementation these basis functions are associated not with the vertices of a mesh, but with the cells of the background grid (a subdivision of the space, not of the geometry). "Modify" here means the functions are adjusted to satisfy the applied restraints everywhere, not just at vertices.
  • No simplification or de-featuring of the model is needed.
  • Convergence can be tested by increasing the resolution of the grid: if a higher resolution produces large changes in the results, keep increasing the resolution (a minimal sketch of this refinement loop, and of the two geometric queries it relies on, appears after this list). Shapiro noted, "The issue is essentially the same as with standard FEA. One can estimate the error and refine the mesh (or increase density in our case), but it is more or less the same for all techniques. We do not do anything automatically right now. We advise the users to run at different resolutions (which requires NO WORK from the user) and compare the results. If results are significantly different, increase the resolution. In principle, this can and will be automated in the future."
  • It can work directly with polygonal models. Scan&Solve performs all analysis-related computations on the native geometry (whether polygonal, NURBS, or another form of geometry). Shapiro stated that "This eliminates the need for preprocessing: no healing, smoothing, de-featuring, or meshing is needed. This drastically reduces preparation/setup time." However, the commercial product in Rhino works only with NURBS solids.
  • It always produces results. Shapiro stated “The solution procedure is deterministic, does not use heuristics, and always produces a result. (In other words, failure means a bug in the code: not inability to handle some geometry.) The advantages of S&S are full automation, complete integration and interoperability. Use it at any stage of the design process: from concept creation to detailed geometry.”
  • Prices are very reasonable. Scan&Solve for Rhino commercial licenses are $695 for a node-locked version and $1,295 for a floating license. Academic, trial, and rental licenses are also available. Scan&Solve for Rhino also requires a Rhino license.
  • Interfaces are currently available for only a limited number of CAD systems, but Scan&Solve can be applied to any geometric model and used within any geometric modeling system that supports two fundamental queries: point membership testing and distance-to-boundary computation.
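
Neither the Scan&Solve internals nor a public API are documented here, so what follows is only a minimal sketch, under my own assumptions, of the two queries named above (point membership and distance to boundary, shown for a hypothetical sphere) together with the resolution-refinement loop Shapiro describes. The solve_at_resolution function is a dummy stand-in, not anything Intact Solutions ships.

```python
import math

# Hypothetical geometry: a solid sphere of radius R centered at the origin.
# Any modeling kernel that can answer these two queries could, in principle,
# be hooked up to a meshfree solver of this kind.
R = 50.0  # mm

def point_is_inside(p):
    """Point membership test: is point p inside the solid?"""
    x, y, z = p
    return math.sqrt(x * x + y * y + z * z) <= R

def distance_to_boundary(p):
    """Signed distance from p to the boundary (positive inside the solid)."""
    x, y, z = p
    return R - math.sqrt(x * x + y * y + z * z)

def solve_at_resolution(n_cells):
    """Dummy stand-in for the real solver: pretend it returns a stress value
    computed on an n_cells^3 background grid, converging as the grid refines."""
    return 100.0 * (1.0 + 1.0 / n_cells)

def converge(tolerance=0.02, resolutions=(20, 40, 80, 160)):
    """Shapiro's advice in code: rerun at increasing resolutions and compare;
    stop once the relative change between runs is small."""
    previous = None
    for n in resolutions:
        result = solve_at_resolution(n)
        if previous is not None:
            change = abs(result - previous) / abs(previous)
            print(f"resolution {n}: result {result:.2f}, change {change:.1%}")
            if change < tolerance:
                return result
        previous = result
    return previous

print(converge())
```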

References: http://www.intact-solutions.com/

http://www.scan-and-solve.com/

Disclosure: No remuneration of any kind was paid for this article.

Conclusion: Both CAD and FEA vendors should check out the possibility of offering this technology as an option for users. With trial copies available from both Rhino and Intact Solutions, users wanting to extend FEA analysis beyond the traditional analysis experts should consider the benefits and urge their CAD partners to investigate this alternative.

Does Siri signify a common CAD UI for the future?

I think, after using it on my iPhone 4S for a few weeks, that the answer is definitely YES. And this is especially true for CAD applications. All CAD apps require a complex series of user interactions, usually performed in a rigid sequence, to move through the process. How nice would it be to just say what you want done and have the computer do the dirty work of interpreting the commands?

Just think. No more menus needed. No more searching multiple menu windows for the precise command. No more squinting at tiny command icons. No need to worry about the sequence of commands.

On my iPhone I can just say, "Make an appointment with Bob for tomorrow at 2:00." That is not too different from "Draw an infinite horizontal line tangent to circle A." When Siri needs more information, it asks for it. For my query above it might ask, "Do you mean Bob Albert, Bobby Jones, or Bob Smith?" How cool is that?
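
To make the idea concrete, here is a purely hypothetical sketch of the disambiguation loop a speech-driven CAD front end would need: parse the utterance into a command, ask about whatever is ambiguous, then hand a fully specified request to the modeler. None of the names or calls below reflect Siri's actual API or any shipping CAD product.

```python
# Hypothetical speech-to-CAD dispatch: parse an utterance, ask for missing
# details, then hand a fully specified command to the modeler.

def find_circles_matching(name):
    """Stand-in for a model query; here we pretend two circles match 'A'."""
    return ["Circle A1", "Circle A2"]

def ask_user(question, options):
    """Stand-in for the voice prompt ('Do you mean ...?')."""
    print(question, "/".join(options))
    return options[0]  # pretend the user picked the first option

def handle_utterance(utterance):
    # A real system would use proper NLP; this toy version keys off keywords.
    if "line tangent to circle" in utterance:
        name = utterance.rsplit("circle", 1)[1].strip()
        matches = find_circles_matching(name)
        target = ask_user("Which circle do you mean?", matches) if len(matches) > 1 else matches[0]
        return {"command": "create_line", "constraint": "tangent",
                "reference": target, "orientation": "horizontal",
                "extent": "infinite"}
    return {"command": "unknown", "utterance": utterance}

print(handle_utterance("draw an infinite horizontal line tangent to circle A"))
```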

I suggest vendors immediately get busy finding smart speech recognition software.

What do you think?

——-

Autodesk 360 and Nexus – PLM 1.0: not perfect – but a great start

3 Dec 2011: Erratum. I was incorrect in stating that Buzzsaw was a local PDM vault for AEC/BIM. Several people have written to me about this, one being Stephen Bodnar of Autodesk. Bodnar stated that "Vault is the on-premise DM solution for both industries, whereas Buzzsaw is cloud-based and is also built on Autodesk's Cloud, and is intended for design file collaboration between partners/suppliers and other users and does, in fact, have bi-directional push/synchronization with Vault."

1 Dec 2011: I am on my way back from Las Vegas, where AU 2011 was held. The highlight of the event, at least for me, was the announcement of what I am calling Autodesk PLM 1.0. The announcement was not a well-kept secret, but the content of the announcement was closely held.

Monday’s media day preceded the conference. The actual PLM announcement came late Tuesday morning. Carl Bass retracted his oft quoted remark about PLM not being something customers worried about; instead, it was revised to mean “until the technology was right.” I couldn’t agree more with his reasoning. Most of Autodesk’s competitors PLM systems offer expensive, difficult to use, and almost impossible to install PLM systems, that rarely have met expectations. Even then, it is often at the cost of massive consulting assistance, rarely meeting anticipated timeframes, AND generally involves the implementation of substantially revised business processes.

Unlike my analyst peers, I have always been skeptical of such large and costly projects. Not being on the implementation side, I could afford to be skeptical. Many such projects, aside from basic PDM, seldom actually get implemented. Most stall. Autodesk estimates that most deliver only PDM. To test this thesis, I tweeted my followers and asked what they had accomplished. With just a few responses, this is hardly scientific, but several stated that they did not yet have even PDM fully implemented!

So what was actually announced? The system is being called Autodesk 360. It is based on having locally installed PDM; for mechanical and for AEC this is Vault. Buzzsaw, a cloud-based application, provides design file collaboration for AEC teams. The third, and new, software piece is called Nexus. The dictionary describes the word nexus as a "connector," which is a good description of what the software aims to do. In the following discussion I concentrate solely on mechanical PLM. For information on Buzzsaw and how it uses Nexus, readers will have to go elsewhere. Try here.

Nexus is cloud-based and comes with 140 or so apps. Each app looks like a series of specialized templates, along with user-customizable workflow logic. Delivery is expected by the end of March 2012. No pricing was announced; however, the implication was that it would be modest. It will be sold on a per-user subscription basis. All Nexus data and apps will run in the cloud, accessed with an ordinary browser. The mass of data will remain locally hosted in Vault. Having and maintaining Vault locally avoids the problem of loading very large datasets to the cloud while still maintaining some degree of interactivity.

How will it interface with Vault and other PDM systems? Very well with Vault. No connectors were announced to integrate with other PDM systems. Autodesk hinted that this is a good opportunity for third party developers and VARs. Connections with Nexus could be implemented via as yet unannounced APIs.

Today, the connection between Vault and Nexus is one way. CAD data cannot be sent from Nexus to Vault. Nor is it synchronized among Vaults, as is done among Apple’s iCloud apps. However, Vault data is automatically synced up to Nexus. Expect bi-directional sync in the future.

Is it easy to install and operate?

Keep in mind that my total exposure to Autodesk 360 Nexus comes from a 30 minute, main stage presentation, followed by a 60 minute working session where about 20 people per workstation watched a very capable Autodesk developer demo and responded to questions, often by showing us how Nexus would solve the proposed question.

Nexus appears to be an out-of-the-box system. It comes with predefined templates and workflows, yet they can easily be added to and/or modified. Fields within templates (apps), along with their characteristics (numeric, values, dates, etc.), can be defined on the fly. A Visio-like graphic interface defines workflows, and many are offered in the starter system. A typical administration system allows assigning users to tasks and roles. Data fields can be interconnected, giving visibility into what drives, or is driven by, what.
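
Autodesk did not show the underlying data model, so the sketch below is only my guess at the general shape of such an app: a template of typed fields plus a workflow expressed as states and role-gated transitions, roughly what the Visio-like editor appeared to produce. The field names, roles, and states are hypothetical.

```python
# Hypothetical sketch of a Nexus-style "app": a template of typed fields
# plus a workflow expressed as states and role-gated transitions.
change_order_app = {
    "name": "Engineering Change Order",
    "fields": [
        {"name": "eco_number", "type": "text"},
        {"name": "affected_items", "type": "item_list"},
        {"name": "cost_impact", "type": "numeric", "units": "USD"},
        {"name": "need_date", "type": "date"},
    ],
    "workflow": {
        "states": ["Draft", "In Review", "Approved", "Released"],
        "transitions": [
            {"from": "Draft", "to": "In Review", "role": "Originator"},
            {"from": "In Review", "to": "Approved", "role": "Change Board"},
            {"from": "In Review", "to": "Draft", "role": "Change Board"},
            {"from": "Approved", "to": "Released", "role": "Configuration Mgmt"},
        ],
    },
}

def allowed_transitions(app, current_state, role):
    """Which states can a user in this role move the record to?"""
    return [t["to"] for t in app["workflow"]["transitions"]
            if t["from"] == current_state and t["role"] == role]

print(allowed_transitions(change_order_app, "In Review", "Change Board"))
# ['Approved', 'Draft']
```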

So. There you have it. I imagine Autodesk will soon, if not already, have many seminars and pre-recorded videos showing the software. Try here: http://usa.autodesk.com/360-lifecycle-management-software/

My conclusions

I think the product is outstanding. Being cloud-based resolves many operating issues. Some users might question the security of hosting much of their data remotely, and they would do well to satisfy themselves one way or the other. Except perhaps for very special circumstances, I think cloud-based security might even be vastly superior to what they could achieve locally. I consider this a non-issue.

Cost-wise, I think this will prove to be much less expensive in the long term than most of today's solutions. Again, this is a non-issue. Just take a look at the slide below, presented by Stephen Bodnar, Autodesk VP of Data Management, which compares some costs for a 200-user deployment.

For collaboration, data can be uploaded either in summary format or as detailed CAD files. Nexus has controls over which users see which data.

Included are project management capabilities that automatically roll up results from completed sub-tasks. Defining projects involves defining sub-projects with easily configurable tasks and reporting procedures. If you have already implemented workflow as part of Vault, it should be redone using Nexus, which allows more flexibility and better visibility.

If you want visibility by projects, by project managers and contributors, with flexibility to change workflows and processes to meet how you do business, it’s all there. My only question is how soon can I get it?

Ray with his skeptical face during AU2011

—-

Here are a few slides from the presentation to give you an idea of what Autodesk presented. Sorry for the quality – I used my phone.

The overall concept of Autodesk 360.

Stephen Bodnar discussing their view of PLM:

Why is it called 360? Showing how the Vault and Buzzsaw make up local PDM systems:

Brenda Discher discussing why users don’t like competitive PDM systems.

What Autodesk is doing about it with Nexus.

Siemens NX CAE Symposium: Users show their love

27 Nov 2011: The week before last I attended an invitation-only event in Charlotte, NC, as a guest of Siemens at their first NX CAE Symposium. Designed as a way for users to get together to exchange ideas about how they use NX CAE software, the symposium drew some 80 customers to the Joe Gibbs Racing facility just outside Charlotte.

The overall consensus of the presenters and the attendees I spoke with was satisfaction with the NX CAE suite. Many complimented the breadth of the CAE software, some of which I summarize below. Overall users were most satisfied because of the inherent associativity of CAE models with design models.

Several users told stories about how, in the past, they were asked by the design team to evaluate designs and get back to them. Even with an integrated system, CAE analysts often spend substantial amounts of time simplifying models, ensuring that the mesh is adequate for accurate results, performing a series of analyses, and making recommendations to the design team, only to find that the design team has moved well beyond the design they were working on, so the analysis work has to be scrapped. NX's CAE and design integration lets analysts work directly on the design model, making it easier to stay synchronized with the design team.

Also, NX seems to play well with external solvers, often integrating them tightly into the design stream workflow. Among these were Ansys solvers as well as specialized fluids solvers, such as those from MAYA.

My reactions:
Siemens PLM Software has a well-focused and broad portfolio of solutions for heavy-duty CAE experts. Jon Heidorn, Siemens PLM Software (SPLMS) Vice President, welcomed the attendees, stressing that simulation is one of their fastest growing markets, encompassing integrated modeling and solutions, system-level modeling, multi-discipline simulation and optimization, and the intensely complex area of simulation data and process management. Looking beyond 2010, Heidorn predicted software would become available to perform topology optimization. SPLMS also announced that its partnership with Joe Gibbs Racing has been extended to 2016.

Mark Bringle and Nelson Cosgrove of Joe Gibbs Racing discussed their facility and their focus on engineering. Building their cars from scratch, and their engines almost from scratch, but carefully following NASCAR rules for each car, provides an impetus to carefully hone each major subsystem for optimal performance. Fascinatingly, their design cycle during racing season is one week! The three main groups include chassis and vehicle dynamics, aerodynamics, and powertrain. The latest version of NX allows for full chassis FEA modeling. With NASCAR demanding similar car frames and engine performance, their engineers carefully analyze every part to improve weight and aero performance so they can achieve even small advantages over their competition.

Jim Rusk of Siemens PLM Software discussed the latest trends in product development with NX CAE simulation. He highlighted a few concepts they are working on and delivering to make analysis easier than ever. Among these are Synchronous Technology for the CAE analyst, which makes simplification easier; workflows for the advanced analyst; continuing improvements in multi-discipline analysis; motion analysis for flexible bodies like springs; multi-solver support; topology optimization; and HD3D requirements management and validation.

ATK Aerospace, MDA of Canada, JPL, Procter & Gamble, and Solar Technologies spoke about analyses ranging from rocket design to cryogenic engineering of spacecraft to making 1 million paper diapers to designing complex solar collectors.

Hendrick Motorsports’, Charles Macdonald, discussed detailed part analysis and the tradeoffs they make for lighter, yet strong and most of all highly serviceable parts of a suspension.

Kendra Short of JPL, mechanical manager of the Mars Science Laboratory (MSL), which successfully launched just two days ago, spoke eloquently about how having a sophisticated analysis system working directly on the design model enabled her team to perform many more complex analyses than would have been possible otherwise. Since there is no way to service the MSL (it's a long trip to Mars), Ms. Short also described the enormous planning that goes into having multiple alternatives in the event of a failure. During a break I was fascinated by a discussion of how the MSL is to be lowered to the surface on a tether. No backup there, just reliable explosive bolts.

One of the symposium's objectives was to have users exchange ideas about how they use simulation, and that seemed to be more than fulfilled. If you have a chance to attend the next symposium, don't miss it.

Disclosure: Siemens paid for my travel expenses to attend the event.

Autodesk Takes Simulation Mobile with New ForceEffect App for iPad

If you have not yet had a chance to see how Autodesk ForceEffect works, visit http://www.youtube.com/playlist?list=PL4F9264A84AD2085B for a series of videos on how this 2D force simulation app works.

Autodesk ForceEffect, a new mobile simulation app for the iPad, allows engineers to quickly and easily simulate design options during the conceptual phase, and is now available on the App Store. Autodesk, as it has done with other iPad apps, offers ForceEffect for free.

ForceEffect provides an easy-to-use environment for drawing, constraining, and simulating concepts using free body diagrams: tap objects to select, move, rotate, and scale them. Real-time solving capabilities provide immediate feedback on the static stress performance of a design, enabling users to apply engineering analysis in the field.
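
Autodesk has not published how ForceEffect's solver works, but the kind of answer it gives back is classic static equilibrium on a free body diagram. As a generic illustration (not ForceEffect's code), here is the textbook calculation of support reactions for a simply supported beam with a single point load, the sort of result the app reports instantly as you drag a load around.

```python
# Generic free-body-diagram statics: reactions of a simply supported beam
# of length L carrying a point load P at distance a from the left support.
# Equilibrium: sum of vertical forces = 0, sum of moments about A = 0.

def simply_supported_reactions(L, P, a):
    R_b = P * a / L          # moment balance about the left support A
    R_a = P - R_b            # vertical force balance
    return R_a, R_b

# Example: 2 m beam, 1000 N load applied 0.5 m from the left support.
R_a, R_b = simply_supported_reactions(L=2.0, P=1000.0, a=0.5)
print(f"Left reaction: {R_a:.0f} N, right reaction: {R_b:.0f} N")
# Left reaction: 750 N, right reaction: 250 N
```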

Users can send the geometry as DXF files, via email, for further analysis.

It’s not quite clear how or whether Autodesk plans to generate revenue from these free apps, yet their thinking is way out in front of their competitors in exploring new ways to use mobile computing and simultaneously explore potential uses of cloud technology. It’s refreshing that the company is forging ahead, exploring new ways of delivering software and testing the waters for new paradigms, both in software and pricing models.

Inforbix – a new approach to cloud based PDM

18 Nov 2011: Oleg Shilovitsky, one of the more prolific bloggers in the PLM industry, recently announced his new venture, Inforbix LLC.

Last week I had the chance to speak with Oleg, the CEO, and his partner, Vic Sanchez, about what their new offering was all about. Of course, I suspected that the new company, with Oleg’s background as a development manager of PLM systems, might be about PDM or PLM. Of course I was right. But, I wanted to find out what the product was all about, who founded the new company, what its objective was, a little bit about the technology, and who might use it and what it might cost.

Oleg and Vic were most accommodating in helping me understand and ferret out answers to the above questions.

Background

Inforbix began development of its product in early 2010. The product had been in beta since April 2011 and was officially launched in October 2011. Shilovitsky teamed with a Russian development team to bring the product to fruition.

About the product

In a nutshell, here is what I learned. Inforbix today consists of a product data crawler app that is installed on the target system or local network containing the product data to be indexed. After the user customizes the crawler app, which basically means telling it where to find the data to be indexed, the app goes to work finding relevant product data, examining the metadata stored within the data files, and indexing it. No actual data files are uploaded to the cloud, only metadata and the files' locations. What makes this exciting is that the crawler can work through many data types and vaults and decode the inherent metadata and product structure.
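
Inforbix has not described how its crawler is implemented, so the snippet below is only a hand-rolled illustration of the general pattern I understood from the briefing: walk the file tree, pull out lightweight metadata, and send only that metadata (never the files themselves) to a cloud index. The file extensions, the upload_metadata call, and the share path are all hypothetical.

```python
# Illustrative metadata crawler: index product files locally, upload only
# metadata (path, type, size, timestamps) to a hypothetical cloud index.
import os
import time

INDEXED_TYPES = {".ipt", ".iam", ".sldprt", ".sldasm", ".prt", ".pdf",
                 ".doc", ".docx", ".xls", ".xlsx"}

def extract_metadata(path):
    stat = os.stat(path)
    return {
        "path": os.path.abspath(path),           # where the file lives
        "name": os.path.basename(path),
        "type": os.path.splitext(path)[1].lower(),
        "size_bytes": stat.st_size,
        "modified": time.ctime(stat.st_mtime),
        # A real crawler would also read CAD-specific properties
        # (part number, revision, references) from the file itself.
    }

def upload_metadata(record):
    """Stand-in for the call that sends metadata to the cloud index."""
    print("indexing:", record["name"])

def crawl(root):
    for dirpath, _dirnames, filenames in os.walk(root):
        for filename in filenames:
            if os.path.splitext(filename)[1].lower() in INDEXED_TYPES:
                upload_metadata(extract_metadata(os.path.join(dirpath, filename)))

# crawl(r"\\fileserver\engineering")   # hypothetical network share
```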

Targeted at small and medium-sized CAD-using companies, the objective of Inforbix is to "help people find, reuse, and share product data."

Both the crawler app and the cloud-based search environment are optimized for manufacturing and design companies. I like that non-vaulted data such as Word docs and PDFs can be "related" back to the products.

The system today supports crawling CAD and PLM data from Autodesk, PTC, SolidWorks, and Siemens, with more coming in the future. PDF, Word, and Excel files are also supported.

A few niceties

It is secure, since no files are changed, moved, or uploaded. Being cloud-based, little maintenance or local support is needed. It is affordable and seems to be priced right: the first 20K files are free, and each additional 20K files costs $600 per year. Sanchez estimated that a typical medium-sized company with 100 people and 30 engineers might spend $10K to $15K per year, a seemingly small cost considering that no hardware and no support staff are needed for the service. It also immediately allows accessing the data worldwide using a browser. Asked about what happens if indexed data moves, Shilovitsky said that the crawler monitors and tracks the new location and updates the cloud.
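
To see how Sanchez's estimate lines up with the published pricing, here is the arithmetic for a hypothetical company indexing 400,000 files; the file count is my own assumption, not a figure from Inforbix.

```python
# Inforbix pricing as quoted: the first 20,000 files are free, then $600
# per year for each additional block of 20,000 files.
def annual_cost(total_files, free_files=20_000, block=20_000, price_per_block=600):
    billable = max(0, total_files - free_files)
    blocks = -(-billable // block)   # ceiling division: round up to whole blocks
    return blocks * price_per_block

print(annual_cost(400_000))   # 19 blocks x $600 = $11,400 per year
```

A result of $11,400 per year lands comfortably inside the $10K to $15K range quoted above.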

Inforbix offers many ways to present the data to make sense of the product connections, including Excel-like tables and filters.

I see a few drawbacks and improvements needed

The original data still needs to be maintained, along with support and local data backups. A local PDM system might still be needed to support applications that depend on understanding the product data structure. Further discussion is needed on how the system allows role-based access to the data; for instance, how can suppliers access it? And relocated data might see a delay before the indexes are updated in the cloud.

Conclusions

I really like the concept and the possibilities for extending it to other areas of a company. It seems that it would be relatively straightforward to have different crawlers looking for different data types. Think of it as a private Google for the data in your entire company: a way to get organized without the fuss. If you are a company without a PDM system (and some 75% of companies are), then this is a perfect way to get started.

Try it out

With a free entry price, it makes sense to give this a try.

A few ways to learn more

The company: www.inforbix.com

The latest press release: http://www.inforbix.com/inforbix-launch-press-release/

Oleg shows how to start using Inforbix in 20 min: http://www.inforbix.com/how-to-start-using-inforbix-in-20-min/

The Cloud Lives!

18 Nov 2011: Ralph Grabowski offered his opinion that the cloud is dead. He couldn't be more wrong. Consider the users at the Siemens NX CAE Symposium that ended last week. Virtually all of the eight users on a panel noted that cloud computing would definitely be part of their plans. Assuming that some minor issues such as security, cost, and application software licensing can be solved, all seem to have it in, or want it in, their future plans.

Several attendees represented companies that already have HPC clusters. While this ideal "local cloud" met their expectations, the cost of such a cluster is very high and not a solution for smaller companies.

I agree that the use of cloud computing for interactive applications is a bad idea. However, the vast computing power, parallel processing, and expected low costs make it very appealing for tasks that require modest bandwidth and have high computational needs. Autodesk's CEO, Carl Bass, clearly has the right idea. Over the past two years Autodesk has introduced several applications that span the range of interactive hardware while relying on the cloud to ramp up compute speeds. At AU last year I had the chance to listen to Bass and speak with him about his ideas for best utilizing the cloud. As I wrote in that article, Autodesk's concept is: "Don't replicate desktop solutions on the cloud. Instead make maximum use of desktop and mobile systems, utilizing the cloud where it makes sense." That still makes sense today. Here is a link to that article: http://wp.me/pvn8U-3e.

Oddly enough, with the possible exception of DS, Autodesk’s competitors don’t seem to get the concept. For example, while I interpreted from Siemens customers that they were excited about potential use of the cloud, Siemens PLM Software, except for licensing issues, seems to have no plans to enable them. The same goes for PTC.

Let me know what you think.
—–

Siemens PLM Software Industry Analyst Meeting 2011 Notes

Acting more like the tortoise than the hare in the fable, Siemens PLM Software (SPLMS) has plodded and plotted its way to the leadership position in the PLM software industry. Sticking with its "never let a customer fail" strategy, as well as other newly elucidated goals, has enabled the company to maintain a steady pace in gaining customers and revenue over the years.

At the SPLMS industry analyst meeting held on the 7th and 8th of September in Boston, Tony Affuso, CEO and Chairman, revealed substantial growth for the company. SPLMS experienced strong double-digit license revenue growth, following five previous quarters of steady growth, and exceeded profitability and cash flow targets. SPLMS does not reveal its precise numerical performance. Since most of the growth was organic (not from acquisitions) and their growth considerably exceeded that of the market, the difference had to come from gaining market share: growing business within existing customers, entering new markets, and winning business from competitors. Affuso discussed recent competitive wins against Dassault Systemes (DS) and PTC. His conclusion, and that of the speakers who followed, credited their strategies of openness and never outmoding customer data, quite different from PTC and DS. DS, in particular, forced customers to endure substantial conversions in going from V4 to V5 and now to V6, and at PTC, Creo appears to require a massive data migration as well. Instead, SPLMS has deployed SOA and XML to maintain a pipeline between new and existing applications. Using such "data pipelines" allows connections between disparate applications, the drawback being a modest degradation in performance due to the additional work required by each application; this has been made relatively unimportant by the tremendous growth in hardware performance that has continued unabated all these years.

One SPLMS executive I spoke with noted that being part of Siemens provides a stability and financial base that allows continued heavy investment in R&D, even in difficult economic times. Both Affuso and President Chuck Grindstaff cited Siemens' devotion to innovation and its refusal to retreat on R&D expenditures during difficult times.

Announced last year, HD PLM seems to be making real headway. Designed as a way to present Teamcenter data visually, it may at last mark the beginning of the end for the numbingly tedious display of tables of data, replaced instead with graphic views of the product. For instance, why look at a table of components that are over cost targets? Instead, show the out-of-whack components using a color scheme on the assembly view of the product. Grindstaff described this as a way of showing the "semantics" of the product data, or relating data relationships. While I am sworn to secrecy, I can tell you that Siemens is exploring methods for the average user to generate his own views of the data he might be interested in. More on this later this year.

More to come in Part 2.

—–

How Local Motors won the DARPA contest

A few weeks ago I published an article entitled "DS clarifies DARPA crowdsource win." A few things, in my mind, needed clarification. Dassault Systemes' PR rep, Jessica Harrison of fama PR, arranged for me to speak with Alex Fiechter, an engineer at Local Motors. I was curious, among other things, about how crowdsourcing was used for the design and whether it was useful. I also wondered how they handled input from 12,000 community users and what process they used. Finally, I wanted to find out more about Local Motors.

Here is how the process worked. Local Motors (LM) massaged the DARPA specs for the contest into a "brief," a mission statement of what was desired, and posted it on their website, asking their community members if they were interested in responding. Most of the community members are interested in industrial design, and some helped LM design its Rally Fighter. Along the way, LM developed its concept for Local Forge, an open-source, web-based co-creation platform. Apparently, car lovers worldwide love to design shapes for the cars of their dreams, and Local Forge is a way for them to share their designs, via images, with all other community members.

A key aspect of the mission statement was to use the existing Rally Fighter chassis as the base upon which to build the body. From the mission statement, eventually 150 to 180 proposals were submitted, from which the final design was chosen. The proposals could be in any electronic form, such as images or even CAD files, but had to show the three required views at a minimum. The community then voted on the submissions; only the winning submitter gets paid. LM used SolidWorks for the mechanical design and CATIA for the body design.

What next? Will it be produced? DARPA, a research arm of the DoD, owns the design now that the contract is complete; the DoD may or may not choose to produce it.

Has Local Motors discovered a new way of doing business that involves minimal plant investment, a way to solicit valuable (and mostly free) input from leading designers, and deliver an exciting new product? You be the judge. Visit some of the links from my previous article quoted above and provide some feedback via comments on this blog.

——