Stereoscopic 3D – it works!

I noticed that Randall Newton, in his GraphicSpeak newsletter of 16 July 2012, commented on the Dimension 3 conference in Paris, which focuses on digital 3D. He stated that “not all is well in 3D Land, reports Kathleen Maher; total revenue from 3D movies is dropping, despite blockbusters that do well. 3D in the home is still an expensive novelty, with few choices for consumers.” (http://gfxspeak.com)

I want to share my own experience with 3D TV and how it changed my perception.

On a recent trip to visit my son, I learned that he had added a room to his house and needed a new HDTV. He asked me to help him make a selection. Being somewhat of an electronics geek, I happily agreed, particularly since he was paying.

Off we went to Costco to browse their selection of LED TVs. One that instantly struck our eyes was a new 3D LED HDTV by Samsung, a 55-inch model from their new ES Series. While this was almost $400 more than we expected to spend, after looking at it and the rest of the offerings, my grandson lobbied hard for this TV. Even without seeing 3D on this set, it blew away anything else in the store. He coupled it with a Samsung 3D Blu-ray player. My contribution was to buy a 3D movie (John Conner 2012) so we could test it out at home.

The hookups were easy, so after about half an hour of moving everything around and connecting all the wires, we turned it on and were blown away by the 3D experience! The lightweight active glasses avoided the headaches I have typically gotten from passive glasses.

The picture was crisp and the 3D was amazing. I didn’t quite flinch when things appeared to come out of the screen, but I came close a few times. I had seen the movie before in 2D and was amazed by the difference. More lifelike. More compelling. Great images.

My conclusion: the next TV I buy will have 3D. And I wonder: why don’t all CAD vendors fully support stereoscopic 3D? We have 3D input devices. We have 3D printers. Yet few vendors offer 3D display software for graphics design. Why?

My Twitter feed at PlanetPTC Live 2012 expanded with additional comments

7 Jun 2012

Introduction

I attended PlanetPTC Live 2012 as a media and analyst guest of PTC earlier this week. I was free to mingle with any user in attendance and to attend the general sessions, as were the other 75 or so media representatives. PTC also organized special sessions for the media. These were generally more concise and allowed more direct interaction with PTC executives, other management, and selected presenters. [Disclosure: PTC paid for my airfare and hotel accommodations.]

I tweeted during the events I attended – not prolifically, as some other tweeters do – choosing instead to focus on what I found interesting and the highlights of some sessions. I have taken most of these tweets and expanded on them below for my blog readers. In a blog to be posted soon, I may add further comments.

In general, the conference was upbeat and well organized. With Creo and Windchill almost evenly divided in terms of revenue, the two lines of business account for some 80% of PTC revenue. The other three (ALM, SCM, and SLM) make up the balance, but represent substantial future growth areas for PTC. All three are collaborative businesses based on Windchill, with SLM the newest. With the PTC business now organized into lines of business, each with its own P&L, customers are better represented.

Tweets expanded (tweets are identified by the • symbol, followed by an expanded explanation)

  • In the exec wrap up on Tuesday, Brian Shepherd confirmed plans for an entry level Windchill. Pre-configured for smaller users.

More: While I had not heard of such an activity, some media representatives had, and they asked about the status of the project. As best I can recollect, this may come out in 2013. It is probably one reason why Windchill ProductPoint was decommissioned last year. Remember that product, which relied on Microsoft SharePoint?

  • PTC realigns organization structure by lines of business, each with P&L responsibility. CAD, PLM, ALM, SCM, and SLM.
  • SLM is service lifecycle management. According to EVP Barry Cohen, an underserved market.
  • Mike Campbell now heading up MCAD segment. Brian Shepherd and Bill Berutti head up other 4. Development reports to EVP Rob Gremley.

More: Here are the relevant descriptions from the latest PTC company info flyer:

Rob Gremley EVP, Product Development & Corporate Marketing
Brian Shepherd EVP, PLM & SCM Segments
Bill Berutti EVP, ALM & SLM Segments
Mike Campbell Division General Manager, MCAD Segment

  • Problems reconciling EBOMs and MBOMs? Now there’s another – SBOMs. Service BOMs add parts kitting.

More: Users have struggled with developing and managing manufacturing BOMs for decades. Now add a new one for managing the service practice – the service BOM (SBOM), which describes the product from a service point of view. SBOMs often contain groups of parts, or kits, that may be replaced as one unit in the field; a rough sketch of how such a kit might be represented appears below.
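To make the kit idea concrete, here is a minimal sketch – purely my own illustration, not PTC’s data model – of how a service BOM might group parts that are replaced as one unit:

```python
# Hypothetical SBOM structures (illustrative only, not PTC's data model).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Part:
    number: str
    description: str

@dataclass
class ServiceKit:
    """A group of parts replaced as one unit in the field."""
    kit_number: str
    parts: List[Part] = field(default_factory=list)

brake_kit = ServiceKit("KIT-100", [
    Part("P-1", "brake pad set"),
    Part("P-2", "caliper bolt"),
    Part("P-3", "anti-rattle clip"),
])

# The SBOM tracks the kit; the EBOM tracks the same parts individually.
print([p.number for p in brake_kit.parts])  # ['P-1', 'P-2', 'P-3']
```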

It appears that Windchill MPMLink manages this process today for MBOMs and EBOMs in companies that use Windchill and Creo. With PTC constructing a Service Lifecycle Management business unit, I am not sure where the SBOM fits relative to the other BOMs or how it is managed. I am sure PTC has thought this out and can provide an answer.

  • Campbell highlights Creo Layout and Freestyle as providing impetus for move to Creo.

More: These two Creo apps are new for Creo 2. Both are aimed at giving users easier-to-use modeling methods, fully integrated with Creo Parametric, and both also play in the concept design space. PTC stressed the connection into Creo, rather than having a stand-alone concept design system – a dig I am sure was meant to rattle the cage of companies using Alias (from Autodesk), today’s most widely used application for industrial and concept design.

  •  PTC positions Creo 2 as opening the floodgates for Wildfire transitions. No cost to users. UI and functions better.

More: Brian Shepherd said this on the first day in the main tent session. For those of you not familiar with the term, it relates back to my days at IBM, where the main tent was where all the attendees gathered, as opposed to the breakout sessions. I guess back in the early days IBM held these sessions under tents – companies were smaller then.

  •  With release of Creo 2, PTC encouraging third parties to develop [apps]. None available from third parties yet. Opportunity to fully integrate.

More: In a follow-up conversation with Brian Thompson, VP of Product Management for Creo, he stated that the requisite APIs are not fully available yet. They will be by Creo 3 and Creo 4. Creo 4, I asked? Yes, he said, by Creo 4 – or two years from now. Third-party developers might want to clarify this directly with PTC.

  • Option modeling another approach to developing ETO configurations. Another approach to developing requirements based models?
  •  Option modeling marries Creo 2 and Windchill 10.1. Can add PLM config options based on geometric positioning.

More: Option modeling allows a concise description of a product with many variants. In some systems users plug all the variants into a single parametric model containing every variant option. This often results in a very large model with an obscure definition of when each variant is used. Creo 2 and Windchill aim to solve this by combining the geometric properties of Creo with the data management properties of Windchill. For example, in a bicycle, all wheels attach to hubs, so one need only keep track of the different wheels, along with any geometric modifications to the model for each wheel. Filters and equations are used for the definitions – I think; I only saw a five-minute video example. A rough sketch of the filtering idea appears below.
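To make the idea concrete, here is a minimal sketch – entirely my own illustration, not PTC’s implementation – of selecting variants from a catalog with attribute filters:

```python
# Illustrative variant filtering (my sketch, not PTC's option modeling).
from dataclasses import dataclass

@dataclass
class Variant:
    name: str
    attributes: dict

catalog = [
    Variant("wheel_26in", {"diameter_in": 26, "use": "mountain"}),
    Variant("wheel_700c", {"diameter_in": 28, "use": "road"}),
]

def select(variants, **filters):
    """Return the variants whose attributes satisfy every filter."""
    return [v for v in variants
            if all(v.attributes.get(k) == want for k, want in filters.items())]

print(select(catalog, use="road"))  # matches only the 700c wheel
```

In a real option model, the filters would presumably be expressions evaluated against the chosen configuration, with the geometry repositioned accordingly.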

  • Attending Cummins case study of integrating mfg and product intros. Closing the loop between the two.

More: Dr. Michael Grieves, author of several books on PLM, together with Apriso, revealed a startling efficiency claim for Cummins, which integrated its PLM, ERP, and MES systems. See if you can get a copy of his slides for the explanation.

  • Main tent sessions focused on Creo 2.0 and hints of what’s to come. Main business line highlighted. Campbell: great job on CAD.

More: On the first day PTC revealed what’s new in upcoming products and its near-term vision for the future.

  • Chief customer officer – Mark Hodges. Never heard of that title.

More: From Wikipedia I found out that a chief customer officer (CCO) is defined as “an executive who provides the comprehensive and authoritative view of the customer and creates corporate and customer strategy at the highest levels of the company to maximize customer acquisition, retention, and profitability.” The CCO typically reports to the chief executive officer, and is potentially a member of the board of directors.

  • High of 97 degs expected today at PlanetPTC in Orlando. Hot AND humid. Good to be inside with A/C all day.

More: Guess someone got a good discount for holding it here this time of the year.

—————–

Dell’s new line of Precision Workstations rival supercomputers

23 April 2012: Last week I attended Dell’s announcement in San Francisco of its new line of Precision workstations, aimed at professionals who need large amounts of computation. Under embargo until today, I am now free to explore the details of the announcement with you.

If any of you were paying attention to my tweets last week, you may have seen my expectations for this line, even before I knew the details. I posited that the new products, following a long tradition of hardware announcements, were going to be faster, more expandable, and better price performers. All of that is true, along with some other characteristics:

  • More green
  • Quieter – 8 thermal sensors in chassis control 8 fans.
  • Reliable memory technology – alerts the user to replace defective DIMMs
  • Smaller packaging
  • Easier to work on
  • Rack mountable
  • Can store up to 8 hard drives internally
  • Processor choice: multi socket or single socket (1 or 2 Xeon E5-2600 processors); each with up to 8 cores
  • Max memory up to 512 GB
  • Offers many times the performance for about the same cost

The four new systems – the T7600, T5600, T3600, and T1650 – all use the latest Intel Xeon processors, and support NVIDIA graphics cards along with NVIDIA’s Tesla boards and their amazing GPUs.

Maximum expandability is enormous and will drive the basic choice of which system users buy. By now the Dell website has been updated with the specs; go to www.dell.com for the details. The Intel spokesman stated that the T7600 is faster than the fastest supercomputer of only six years ago. Absolutely incredible!

I spent a fair amount of time at dinner the night before the meeting speaking with the industrial designers, who were very excited about the design, particularly the packaging and the ease with which users can access and upgrade the internals. On the T7600 and T5600, the motherboard is now positioned away from the chassis wall so that the power cabling is all on one side and the electronic connections are on the other. Very nice. The entire power supply is an isolated unit that plugs directly into the chassis, so it can be swapped out in seconds – a far cry from having to mess with power plugs, as in the past.

Dell T7600 interior with dual processors

Note that no power cables are on this side of the motherboard, which is now more towards the middle of the chassis.

Here is a look at the removable power supply, accessible from the rear of the unit.

Removable power supply

Here is what the new towers look like (the ones on the left).

The Dell Precision T1650, T3600, T5600 and T7600 (left to right in the above image)

I generally don’t cover hardware announcements, but I made an exception in this case because these workstations are clearly aimed at the engineering and rendering/animation markets. All four workstations will be available for purchase worldwide starting in May.

  • The Dell Precision T7600 pricing starts at $2,149 USD
  • The Dell Precision T5600 pricing starts at $1,879 USD
  • The Dell Precision T3600 pricing starts at $1,099 USD
  • The Dell Precision T1650 pricing will be announced in May

I expect that a reasonably configured T7600 will be in the $4,000 range, and it could go much higher by adding up to three Nvidia boards (a Quadro plus up to two Tesla GPU boards), now possible with its 600 watt power supply and high-speed bus access directly to the Xeon processors. Nvidia’s Tesla GPU boards contain up to 448 cores. You can find out more at http://www.nvidia.com/object/personal-supercomputing.html.

Using the full capacity of these multi-core systems requires that the software be optimized for a multi-processor architecture. In conversations with Nvidia representatives, they said that programs with tight loops and high compute requirements that process minimal amounts of data are ideal candidates. FEA solvers and renderers are ideal for multi-threading; a small sketch of such a workload appears below. By the way, if you had a chance to see the movie Hugo, a huge part of it was rendered. I can hardly imagine the compute cycles required.
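For readers who want to see what “tight loops, high compute, minimal data” means in practice, here is a minimal sketch of a compute-bound workload spread across all cores. It is my own illustration, not Dell’s or Nvidia’s code; stress_at_node is a hypothetical stand-in for a per-node solver computation.

```python
# Sketch of a compute-bound, multi-core workload (illustrative only).
from multiprocessing import Pool

def stress_at_node(node_id: int) -> float:
    """Hypothetical stand-in for a per-node FEA computation:
    lots of arithmetic, very little input data."""
    total = 0.0
    for i in range(1, 200_000):          # the "tight loop"
        total += (node_id % 17 + 1) / (i * i)
    return total

if __name__ == "__main__":
    with Pool() as pool:                 # one worker per available core
        results = pool.map(stress_at_node, range(1_000))
    print(f"processed {len(results)} nodes")
```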

Takeaways:

If you are a power user, this is the way to go: truly supercomputer performance delivered at workstation prices.

I tried very hard to get metrics on speed comparisons, but Dell (rightly so) claimed that performance is so specific to the job being run that they were unwilling to discuss numbers. Still, with all the standard benchmarks out there, I am a little surprised. If any of my readers want to share their experiences on the performance side, I will be happy to publish meaningful results.

While I am a Mac fan, I have several PC workstations on site. I am salivating at the chance to get my hands on a properly equipped T7600. I expect many of you are as well.

I hope the software vendors will soon expand their capabilities to match the hardware power now available. For you software vendors reading this, here are some things to think about:

  • More industry-specific software that needs minimal information to produce designs
  • Fully integrated analysis during design (like spell checkers)
  • Turning well-specified requirements into designs
  • Better graphical PLM systems, with fully automatic selection and rendering of even the most complex products
  • Allowing users to work totally with 3D stereoscopic models
  • And so on . . .

 

Disclosure: Dell paid for my travel expenses to and from the meeting and hotel accommodations and meals in San Francisco.

 —

Rhino 5 Beta Features a Gumball Manipulator

15 April 2012: Last week I had a chance to sit in on a webinar, hosted by Novedge, about the new gumball manipulator in Rhino 5. Why it’s called the gumball manipulator I have no idea – and, in response to this question, apparently neither does the Rhino team. Brian James of Robert McNeel & Associates presented.

The gumball manipulator enables an impressive list of capabilities for modifying surfaces and curves directly.

Here is a view of the manipulator, selected to operate on the yellow curve.

Gumball Manipulator

It can perform translation, rotation, and scaling on the selected object, and it can also be used to create geometry. This curve can be used to create the first solid, as shown below; a scripted sketch of the same transforms follows the image.

Scaling and translating the top face to modify a solid.
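For those who prefer scripting, the same translate/rotate/scale operations are available through Rhino’s Python scripting (rhinoscriptsyntax). This is a minimal sketch, runnable only inside Rhino’s Python editor, and it assumes you pick one object when prompted:

```python
# Minimal sketch of gumball-style transforms via scripting (Rhino only).
import rhinoscriptsyntax as rs

obj = rs.GetObject("Select an object to transform")  # pick interactively
if obj:
    rs.MoveObject(obj, (10, 0, 0))             # translate 10 units along X
    rs.RotateObject(obj, (0, 0, 0), 45.0)      # rotate 45 deg about origin
    rs.ScaleObject(obj, (0, 0, 0), (2, 2, 2))  # uniform 2x scale
```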

Eventually, using the gumball, other Rhino functions, and a few more curves, the presenter created this faucet.

The faucet

The additional curves in the image below were then used to generate the sink.

The final sink model

This was all pretty impressive, and it demonstrated that McNeel is continuing to develop Rhino into a “solid” CAD system featuring advanced curves, surfaces, and solids, as well as a unique UI. All at a modest price.

Take a look for yourself via the recorded webinar.

More info:

Link to the recorded webinar:

http://www.rhinojungle.com/video/novedge-webinar-series-episode-43-rhino-5-overview-featuring-the

www.novedge.com, a leading online software superstore, has video demos available for many products.

Scan&Solve offers mesh-free FEA

19 March 2012: Recently I had a chance to sit in on a demo of Scan&Solve™, software promising to solve parts for linear stress FEA virtually automatically, without any concern about meshing the part. Used in conjunction with solid models of parts developed in Rhino, the demo did just that. To verify the accuracy of the results, the presenter adjusted something called the resolution of the “geometry scan” of the part. Adjusting the resolution showed that the accuracy was converging. Wow, I thought. Time to find out more about how this worked, how extensible it was, how it differed from traditional FEA, and its cost. I went to the company website and soon located the founder of Intact Solutions LLC, the company that authored the software: Vadim Shapiro, Professor of Mechanical Engineering and Computer Sciences at the University of Wisconsin-Madison.

Right away I figured: uh oh, an academic. They are not always known for providing crisp answers. Nevertheless, I requested and was granted an interview with Dr. Shapiro. He turned out to be very open and enthusiastic about the product.

Vadim Shapiro, Intact Solutions founder

Here is what Scan&Solve does differently from traditional FEA meshers and solvers:

  • Scan&Solve is not a replacement for FEA; it is an extension of FEA, which aims specifically to solve the problem of CAD/CAE interoperability. Any reasonable geometric kernel and any reasonable FEA package can be interfaced with great benefits. The goals of the product are simplicity, universality, and complete automation.
  • The current version analyzes parts only, not assemblies.
  • Instead of meshing, the software assigns an analysis space (a grid) surrounding the part to be worked on, as shown below:

How Scan&Solve works

  • Scan&Solve interfaces with the CAD system, which supplies the coordinates of the model for each point on the grid. Given this interface, there is no need to create a mesh; instead the software works with the precise model geometry. Scan&Solve directly modifies the basis functions, sometimes called “shape functions” – the functions that approximate the solution of the problem. In the current implementation, these basis functions are associated not with the vertices of a mesh, but with the cells of the grid (a grid of the space, not of the geometry). “Modify functions” means they are modified to satisfy the applied restraints everywhere, not just at vertices. Scan&Solve™ can be applied to any geometric model, and used within any geometric modeling system, that supports two fundamental queries: point membership testing and distance-to-boundary computation (see the sketch after this list).
  • No simplification or de-featuring of the model is needed.
  • Increasing the resolution of the grid tests the convergence of the results. If a higher resolution produces large changes in the results, keep increasing the resolution. Shapiro noted, “The issue is essentially the same as with standard FEA. One can estimate the error and refine the mesh (or increase density in our case), but it is more or less the same for all techniques. We do not do anything automatically right now. We advise the users to run at different resolutions (which requires NO WORK from the user) and compare the results. If results are significantly different, increase the resolution. In principle, this can and will be automated in the future.”
  • It can work directly with polygonal models. Scan&Solve performs all analysis-related computations on the native geometry (whether polygonal, NURBS, or another form of geometry). Shapiro stated that “This eliminates the need for preprocessing: no healing, smoothing, de-featuring, or meshing is needed. This drastically reduces preparation/set up time.” However, the commercial product for Rhino works only with NURBS solids.
  • It always produces results. Shapiro stated “The solution procedure is deterministic, does not use heuristics, and always produces a result. (In other words, failure means a bug in the code: not inability to handle some geometry.) The advantages of S&S are full automation, complete integration and interoperability. Use it at any stage of the design process: from concept creation to detailed geometry.”
  • Prices are very reasonable. Scan&Solve for Rhino commercial licenses are $695 for a node-locked version and $1,295 for a floating license. Academic, trial, and rental licenses are also available. Scan&Solve for Rhino also requires a Rhino license.
  • Interfaces are currently available for a limited number of CAD systems, but any geometric modeling system supporting the two queries above could, in principle, be supported.
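To make the two queries concrete, here is a minimal sketch – entirely my own illustration, not Intact Solutions’ API – of a modeling system answering point membership and distance-to-boundary for a sphere, followed by the resolution-sweep convergence check Shapiro describes:

```python
# Illustrative only: the two queries Scan&Solve needs from a modeler,
# shown for a sphere of radius r centered at the origin.
import math

class SphereModel:
    def __init__(self, radius: float):
        self.radius = radius

    def contains(self, x: float, y: float, z: float) -> bool:
        """Point membership test: is the point inside the solid?"""
        return math.sqrt(x*x + y*y + z*z) <= self.radius

    def distance_to_boundary(self, x: float, y: float, z: float) -> float:
        """Distance from the point to the solid's boundary."""
        return abs(self.radius - math.sqrt(x*x + y*y + z*z))

model = SphereModel(radius=2.0)
print(model.contains(1.0, 1.0, 0.0))              # True (inside)
print(model.distance_to_boundary(3.0, 0.0, 0.0))  # 1.0

# Convergence check in the spirit of Shapiro's advice: re-run the solve at
# increasing resolutions and compare (solve_at is a hypothetical stand-in).
# for res in (32, 64, 128):
#     print(res, solve_at(model, resolution=res))
```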

References: http://www.intact-solutions.com/

http://www.scan-and-solve.com/

Disclosure: No remuneration of any kind was paid for this article.

Conclusion: Both CAD and FEA vendors should check out the possibility of offering this technology as an option for users. With trial copies available from both Rhino and Intact Solutions, users wanting to extend FEA beyond the traditional analysis experts should consider the benefits and urge their CAD partners to investigate this alternative.

Does Siri signify a common CAD UI for the future?

After using it on my iPhone 4S for a few weeks, I think the answer is definitely YES – and this is especially true for CAD applications. All CAD apps require a complex series of user interactions, usually performed in a rigid order. How nice would it be to just say what you want done and have the computer do the dirty work of interpreting the commands?

Just think. No more menus. No more searching multiple menu windows for the precise command needed. No more squinting at tiny command icons. No need to worry about sequencing the commands.

On my iPhone I can just say, “Make an appointment with Bob for tomorrow at 2:00” – not too different from “Draw an infinite horizontal line tangent to circle A.” When Siri needs more information, it asks for it. For my query above it might say, “Do you mean Bob Albert, Bobby Jones, or Bob Smith?” How cool is that? A toy sketch of the idea applied to CAD appears below.
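Here is a toy sketch – nothing like a production speech pipeline, and every name in it hypothetical – of how a recognized phrase might be mapped to a CAD command, with a Siri-style follow-up question when information is missing:

```python
# Toy speech-to-CAD-command mapping (all names hypothetical).
import re

def interpret(phrase: str) -> str:
    m = re.match(r"draw an? (\w+) (\w+) line tangent to circle (\w+)",
                 phrase.lower())
    if not m:
        return "Sorry, I didn't understand that."
    extent, orientation, circle = m.groups()
    if orientation not in ("horizontal", "vertical"):
        # Siri-style clarifying question for ambiguous input
        return f"Which orientation do you mean for the {extent} line?"
    return (f"create_line(extent={extent!r}, "
            f"orientation={orientation!r}, tangent_to={circle!r})")

print(interpret("Draw an infinite horizontal line tangent to circle A"))
```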

I suggest vendors immediately get busy finding smart speech recognition software.

What do you think?

——-

Siemens NX CAE Symposium: Users show their love

27 Nov 2011: The week before last I attended an invitation-only event in Charlotte, NC, as a guest of Siemens at its first NX CAE Symposium. Designed as a way for users to exchange ideas about how they use NX CAE software, the symposium drew some 80 customers to the Joe Gibbs Racing facility just outside Charlotte.

The overall consensus of the presenters and the attendees I spoke with was satisfaction with the NX CAE suite. Many complimented the breadth of the CAE software, some of which I summarize below. Above all, users were most satisfied with the inherent associativity of CAE models with design models.

Several users told stories about how, in the past, they were asked by the design team to evaluate designs and get back to them. Even with an integrated system, CAE analysts often spent substantial amounts of time simplifying models, ensuring that the mesh was adequate for an accurate analysis, performing a series of analyses, and making recommendations to the design team, only to find that the design team had moved well beyond the design they were analyzing, so their work had to be scrapped. NX’s CAE and design integration lets analysts work on the design model itself, making it much easier to stay synchronized with the design team.

Also, NX seems to play well with external solvers, often integrating them tightly into the design workflow. Among these are Ansys solvers as well as specialized fluid solvers, such as those from MAYA.

My reactions:
Siemens PLM Software has a well-focused and broad set of solutions for heavy-duty CAE experts. Jon Heidorn, Siemens PLM Software (SPLMS) Vice President, welcomed the attendees, stressing that simulation is one of their fastest growing markets, encompassing integrated modeling and solutions, system-level modeling, multi-discipline simulation and optimization, and the intensely complex management of simulation data and processes. Looking beyond 2010, Heidorn predicted software would become available to perform topology optimization. SPLMS also announced that its partnership with Joe Gibbs Racing has been extended to 2016.

Mark Bringle and Nelson Cosgrove of Joe Gibbs Racing discussed their facility and its focus on engineering. Building their cars from scratch, and their engines almost from scratch, while carefully following NASCAR rules for each car, provides an impetus to carefully hone each major subsystem for optimal performance. Fascinatingly, their design cycle during racing season is one week! The three main groups are chassis and vehicle dynamics, aerodynamics, and powertrain. The latest version of NX allows for full chassis FEA modeling. With NASCAR demanding similar car frames and engine performance, their engineers carefully analyze every part for weight and aero performance to eke out even small advantages over the competition.

Jim Rusk of Siemens PLM Software discussed the latest trends in product development with NX CAE simulation. He highlighted a few concepts they are working on and delivering to make analysis easier than ever: Synchronous Technology for the CAE analyst, which makes for easier simplification; workflows for the advanced analyst; continuing improvements in multi-discipline analysis; motion analysis for flexible bodies such as springs; multi-solver support; topology optimization; and HD3D requirements management and validation.

ATK Aerospace, MDA of Canada, JPL, Procter & Gamble, and Solar Technologies spoke about analyses ranging from rocket design, to cryogenic engineering of spacecraft, to making a million paper diapers, to designing complex solar collectors.

Hendrick Motorsports’ Charles Macdonald discussed detailed part analysis and the tradeoffs made to get lighter, yet strong and, above all, highly serviceable suspension parts.

Kendra Short of JPL, mechanical manager of the Mars Science Laboratory (MSL), which successfully launched just two days earlier, spoke eloquently about how having a sophisticated analysis system working directly on the design model enabled her team to perform many more complex analyses than would have been possible otherwise. Since there is no way to service the MSL (it’s a long trip to Mars), Ms. Short described the enormous planning that goes into having multiple alternatives in the event of a failure. During a break I was fascinated by a discussion of how the MSL is to be lowered to the surface using a tether. No backup here – just reliable explosive bolts.

One of the symposium’s objectives was to have users exchange ideas about how they use simulation. That objective seemed more than fulfilled. If you have a chance to attend the next symposium, don’t miss it.

Disclosure: Siemens paid for my travel expenses to attend the event.