Dell announces new Precision Mobile workstations

24 July 2012: I was pre-briefed last Friday about Dell’s new mobile workstations. Today I am in the Shenandoah National Park in Virginia writing this blog, taking time from all this natural beauty to keep you informed about the latest technology.

Today Dell is announcing the newest entries into its Precision line of workstations – the M4700 and the M6700 mobile workstations.

While these are hardly your ordinary laptops, weighing in at about 8 pounds, they offer a high degree of portability along with an awesome display of computing power: incredible graphics options, high-speed memory of up to 16 GB or 32 GB depending on memory speed, disk capacities of up to 3.2 terabytes, and all this with a 10-hour battery life. Eight pounds is still a lot to carry, but it beats lugging a desktop workstation and an external display to your next meeting. This is truly a machine that I would be comfortable using as both my desktop workstation AND my mobile workstation.

They are not cheap. Starting prices range from $1,649 for the Dell Precision M4700 to $2,199 for the M6700 and $3,579 for the M6700 Covet. Fully decked out, I could easily imagine them coming in at 100% to 150% above their base prices.

Why the Covet name, I asked? Mano Gialusis, Dell’s Sr. Product Manager, replied that after seeing the Gorilla Glass and edge-to-edge display, all users will covet it!

Last year TechniCom had the opportunity to benchmark software using the M6600, which delivered excellent performance on Autodesk Inventor and SolidWorks. While Dell was not able to provide any specifics on the performance of the latest workstations, I estimate that the M6700 with the Extreme Edition i7 processor and the higher-performance graphics cards should deliver 2X to 5X the overall performance on compute-intensive operations. The overall performance that users see will naturally depend on the specific applications being run.

A bit about the graphics options. The AMD card is new, and these workstations are the first to have it. The specs are available on the Dell website. This card offers HD3D stereoscopic viewing. NVIDIA offers three cards featuring its Quadro GPU technology, 3D Vision, and power conservation techniques. See more details at http://www.dell.com/precision .

Dell was not able to use Intel Xeon processors because they are too big and power hungry for mobile workstations. Instead, the Intel Core i7 Extreme Edition processor provides outstanding features, such as four 3.33 GHz cores for better multitasking and multithreaded performance, 8 MB of smart cache, an integrated memory controller delivering high memory bandwidth, and a new QuickPath Interconnect for fast data transfer between the processor and chipset.

Conclusions:
I am very impressed with Dell’s attention to the high-end engineering and graphics markets. The announcement a few months ago of their desk-side systems shows that they understand these market needs by offering well-balanced compute speeds, amazing graphics, monstrous expandability, serviceability, and reliability. These mobile workstations expand on those offerings.

These babies are hot! If you need high-end, portable workstations that are truly supercomputers, there is no better way to go.

http://www.dell.com/precision

Disclosure: I received no compensation for this review. All opinions are mine.
—-

My Twitter feed at PlanetPTC Live 2012 expanded with additional comments

7 Jun 2012

Introduction

I attended PlanetPTC Live 2012 as a media and analyst guest of PTC earlier this week. I was free to mingle with any user in attendance, and attend the general sessions, as were the other 75 or so media representatives. PTC also organized special sessions for the media. These sessions generally were more concise and allowed more direct interaction with PTC executives, other management and selected presenters. [Disclosure: PTC paid for my airfare and hotel accommodations.]

I tweeted during the events I attended, not as prolifically as some other tweeters, choosing instead to focus on what I found interesting and the highlights of some sessions. I have taken most of these tweets and expanded on them below for my blog readers. In a blog post to be published soon, I may add additional comments.

In general, the conference was upbeat and well organized. With Creo and Windchill almost evenly divided in terms of revenue, the two lines of business account for some 80% of PTC revenue. The other three (ALM, SCM, and SLM) make up the balance but represent substantial future growth areas for PTC. All three are collaborative businesses based on Windchill, with SLM being the newest. With the PTC business now organized by lines of business, each with its own P&L, customers are better represented.

Tweets expanded (tweets are identified by the • symbol, followed by an expanded explanation)

  • In the exec wrap up on Tuesday, Brian Shepherd confirmed plans for an entry level Windchill. Pre-configured for smaller users.

More: While I had not heard of such an activity, some media representatives had and asked about the status of the project. As best I can recollect, this may come out in 2013. This is probably one reason why Windchill ProductPoint was decommissioned last year. Remember that product, which relied on Microsoft SharePoint?

  • PTC realigns organization structure by lines of business, each with P&L responsibility. CAD, PLM, ALM, SCM, and SLM.
  • SLM is service lifecycle management. According to EVP Barry Cohen, an underserved market.
  • Mike Campbell now heading up MCAD segment. Brian Shepherd and Bill Berutti head up the other four. Development reports to EVP Rob Gremley.

More: Here are the relevant descriptions from the latest PTC company info flyer:

Rob Gremley EVP, Product Development & Corporate Marketing
Brian Shepherd EVP, PLM & SCM Segments
Bill Berutti EVP, ALM & SLM Segments
Mike Campbell Division General Manager, MCAD Segment

  • Problems reconciling EBOMs and MBOMs? Now there’s another – SBOMs. Service BOMs add parts kitting.

More: Users have struggled with developing and managing manufacturing BOMs for decades. Now add a new one for managing service practices – the Service BOM, which describes the product from a service point of view. These often contain groups of parts that may be replaced as one unit in the field.

It looks like Windchill MPMLink today manages this process for MBOMs and EBOMs in those companies that use Windchill and Creo. With PTC constructing a Service Lifecycle Management business unit, I am not sure where or how the SBOM relates to the other BOMs and how it is managed. I am sure PTC has thought this out and can provide an answer.

  • Campbell highlights Creo Layout and Freestyle as providing impetus for move to Creo.

More: These two Creo apps are new for Creo 2. Both are targeted at giving users easier-to-use modeling methods, fully integrated with Creo Parametric. Both apps also play in the concept design space. PTC stressed the connection into Creo, rather than having a stand-alone concept design system, a dig I am sure was meant to rattle the cage of companies using Alias (from Autodesk), today’s most widely used application for industrial and concept design.

  •  PTC positions Creo 2 as opening the floodgates for Wildfire transitions. No cost to users. UI and functions better.

More: Brian Shepherd said this on the first day in the main tent session. For those of you not aware of the term main tent, it relates back to my days at IBM, where the main tent was where all the attendees gathered, as opposed to the breakout sessions. I guess back in the early days IBM held these sessions under tents – companies were smaller then.

  •  With release of Creo 2, PTC encouraging third parties to develop [apps]. None available from third parties yet. Opportunity to fully integrate.

More: In a follow-up conversation with Brian Thompson, VP of Product Management for Creo, he stated that the requisite APIs are not fully available yet. They will be by Creo 3 and Creo 4. Creo 4, I asked! Yes, he said, by Creo 4, or two years from now. Third-party developers might want to clarify this directly with PTC.

  • Option modeling another approach to developing ETO configurations. Another approach to developing requirements based models?
  •  Option modeling marries Creo 2 and Windchill 10.1. Can add PLM config options based on geometric positioning.

More: Option modeling allows a concise description of a product with many variants. In some systems, users plug all the variants into a single parametric model containing every variant option. This often results in a very large model with an obscure definition of when each variant is used. Creo 2 and Windchill 10.1 aim to solve this by combining the geometric properties of Creo with the data management properties of Windchill. For example, in a bicycle, all wheels attach to hubs, so one need only keep track of the different wheels, along with any geometric modifications needed for each. Filters and equations are used for the definitions. At least I think so, because I only saw a five-minute video example; a minimal sketch of the idea follows below.
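To make that concrete, here is a minimal sketch in Python of what a filter-and-equation style variant definition might look like. The part names, option fields, and the hub-offset "equation" are my own inventions for illustration; they do not reflect PTC's actual data model or API.

```python
# Hypothetical sketch of option modeling: a variant is selected by option
# filters, and a small equation drives a geometric parameter from the choice.
# Names are illustrative only; they do not reflect PTC's actual products.

BICYCLE_WHEELS = [
    {"name": "road_wheel",     "diameter_mm": 622, "options": {"use": "road"}},
    {"name": "mountain_wheel", "diameter_mm": 559, "options": {"use": "mountain"}},
    {"name": "kids_wheel",     "diameter_mm": 406, "options": {"use": "kids"}},
]

def select_variant(parts, chosen_options):
    """Return the parts whose option filters match the chosen configuration."""
    return [p for p in parts
            if all(chosen_options.get(k) == v for k, v in p["options"].items())]

def hub_offset(diameter_mm):
    """A toy 'equation' driving a geometric parameter from the selected wheel."""
    return diameter_mm / 2 + 35  # hub center height above ground, in mm

if __name__ == "__main__":
    for wheel in select_variant(BICYCLE_WHEELS, {"use": "mountain"}):
        print(wheel["name"], "hub offset:", hub_offset(wheel["diameter_mm"]), "mm")
```

The point is that only the filters and the driving equations are stored per variant, rather than one enormous model carrying every option at once.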

  • Attending Cummins case study of integrating mfg and product intros. Closing the loop between the two.

More: Dr. Michael Grieves, author of several books on PLM, along with Apriso, revealed a startling efficiency claim for Cummins, which integrated its PLM, ERP, and MES systems. See if you can get a copy of his slides for an explanation.

  • Main tent sessions focused on Creo 2.0 and hints of what’s to come. Main business line highlighted. Campbell: great job on CAD.

More: On the first day PTC revealed what’s new with upcoming products and its near-term vision for the future.

  • Chief customer officer – Mark Hodges. Never heard of that title.

More: From Wikipedia I found out that a chief customer officer (CCO) is defined as “an executive who provides the comprehensive and authoritative view of the customer and creates corporate and customer strategy at the highest levels of the company to maximize customer acquisition, retention, and profitability.” The CCO typically reports to the chief executive officer, and is potentially a member of the board of directors.

  • High of 97 degs expected today at PlanetPTC in Orlando. Hot AND humid. Good to be inside with A/C all day.

More: Guess someone got a good discount for holding it here this time of the year.

—————–

Dell’s new line of Precision Workstations rivals supercomputers

23 April 2012: Last week I attended the Dell Precision Workstation announcement in San Francisco for their new line of Precision workstations meant for professionals who need large amounts of computation. The announcement was under embargo until today, so I am now free to explore its details with you.

If any of you were paying attention to my tweets of last week you may have seen my expectations for this line, even before I knew what the details were. I posited that the new products, following a long tradition of hardware announcements, were going to be faster, more expandable, and better price performers. All of that is true, along with some other characteristics:

  • More green
  • Quieter – 8 thermal sensors in chassis control 8 fans.
  • Reliable memory technology – alerts the user to replace defective DIMMs
  • Smaller packaging
  • Easier to work on
  • Rack mountable
  • Can store up to 8 hard drives internally
  • Processor choice: multi socket or single socket (1 or 2 Xeon E5-2600 processors); each with up to 8 cores
  • Max memory up to 512 GB
  • Offers many times the performance for about the same cost

The four new systems, the T7600, T5600, T3600, and T1650, all use the latest Intel Xeon processors and support NVIDIA graphics cards along with NVIDIA’s Tesla boards and their amazing GPUs.

Maximum expandability is enormous and will likely drive the choice of which system users buy. By now the Dell website has been updated with the specs; go to www.dell.com for the details. The Intel spokesman stated that the T7600 is faster than the fastest supercomputer of only six years ago. Absolutely incredible!

I spent a fair amount of time at dinner the night before the meeting speaking with the industrial designers, who were very excited about the design, particularly the packaging and the ease with which users can access and upgrade the internals. On the T7600 and the T5600, the motherboard is now positioned away from the chassis so that the power cabling is all on one side and the electronic connections are on the other. Very nice. The entire power supply is an isolated unit that plugs directly into the chassis, thus it can swap out in seconds. A far cry from having to mess with the power plugs, as in the past.

Dell T7600 interior with dual processors

Note that no power cables are on this side of the motherboard, which is now more towards the middle of the chassis.

Here is a look at the removable power supply, accessible from the rear of the unit.

Removable power supply

Here is what the new towers look like (the ones on the left).

The Dell Precision T1650, T3600, T5600 and T7600 (left to right in the above image)

I generally don’t cover hardware announcements, but I made an exception in this case because these workstations are clearly aimed at the engineering and rendering/animation markets. The Dell Precision T1650, T3600, T5600 and T7600 workstations will be available for purchase worldwide starting in May.

  • The Dell Precision T7600 pricing starts at $2,149 USD
  • The Dell Precision T5600 pricing starts at $1,879 USD
  • The Dell Precision T3600 pricing starts at $1,099 USD
  • The Dell Precision T1650 pricing will be announced in May

I expect that the T7600, reasonably configured, will be in the $4000 range and could go much higher by adding up to three Nvidia boards (a Quadro plus up to 2 Tesla GPU boards), a combination now possible with its 600-watt power supply and high-speed bus access directly to the Xeon processors. Nvidia’s GPU boards, called Tesla boards, contain up to 448 cores. You can find out more at http://www.nvidia.com/object/personal-supercomputing.html.

Using the full capacity of these multi-core systems requires that the software be optimized for a multi-processor architecture. Nvidia representatives told me that programs that have tight loops and high compute requirements while processing minimal amounts of data are ideal candidates. FEA solvers and renderers are ideal for multi-threading. By the way, if you had a chance to see the movie Hugo, a huge part of it was rendered. I can hardly imagine the compute cycles required.
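As a rough illustration of why these compute-bound, data-light workloads scale so well across cores, here is a minimal Python sketch that splits a tight numerical loop across all available processors using the standard multiprocessing module. It is only an analogy for what a multi-threaded FEA solver or renderer does internally, not vendor code; the kernel function is an invented stand-in.

```python
# Minimal sketch: a compute-bound loop (little data in, lots of arithmetic)
# split across all available cores with the Python standard library.
import math
from multiprocessing import Pool

def heavy_kernel(n):
    """Stand-in for a tight numerical loop, e.g. one chunk of an FEA solve."""
    acc = 0.0
    for i in range(1, 200_000):
        acc += math.sin(n * i) / i
    return acc

if __name__ == "__main__":
    work = list(range(64))        # 64 independent chunks of work
    with Pool() as pool:          # one worker process per available core
        results = pool.map(heavy_kernel, work)
    print("checksum:", sum(results))
```

Because each chunk needs almost no input data and produces a single number, adding cores adds throughput almost linearly, which is exactly the profile Nvidia described.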

Takeaways:

If you are a power user, this is the way to go: truly super-computer performance delivered for workstation prices.

I tried very hard to get metrics on speed comparisons, but Dell (rightly so) claimed that performance is so specific to the job being run that they were unwilling to discuss numbers. With all of the standard benchmarks out there, I am a little surprised. If any of my readers want to share their experiences on the performance side, I will be happy to publish meaningful results.

While I am a Mac fan, I have several PC workstations on site. I am salivating at the chance to get my hands on a properly equipped T7600. I expect many of you are as well.

I hope the software vendors will soon expand their capabilities to match the hardware power now available. For you software vendors reading this, here are some things to think about:

  • More industry specific software that needs minimal information to produce designs.
  • Fully integrated analysis during design (like spell checkers)
  • Turning well specified requirements into designs
  • Better graphical PLM systems, with fully automatic selection and rendering, even for the most complex products
  • Allowing users to work totally with 3D stereoscopic models
  • And so on . . .

 

Disclosure: Dell paid for my travel expenses to and from the meeting and hotel accommodations and meals in San Francisco.

 —

Rhino 5 Beta Features a Gumball Manipulator

15 April 2012: Last week I had a chance to sit in on a webinar hosted by Novedge about the new Gumball manipulator for Rhino 5. Why it’s called the gumball manipulator I have no idea, and in response to this question, apparently McNeel doesn’t either. Brian James, from Robert McNeel & Associates, presented the webinar.

Using the gumball manipulator allowed for a very impressive set of capabilities for modifying surfaces and curves directly.

Here is a view of the manipulator, selected to operate on the yellow curve.

Gumball Manipulator

It can perform translation, rotation, and scaling on the selected object. It can also be used to create geometry. This curve can be used to create the first solid, as shown below.

Scaling and translating the top face to modify a solid.

Eventually, using the gumball, other Rhino functions, and a few other curves, the presenter created this faucet.

The faucet

The additional curves in the next image were then used to generate the final sink model shown below.

The final sink model

This was all pretty impressive and demonstrated that McNeel is continuing to develop Rhino into a “solid” CAD system featuring advanced curves, surfaces, and solids, as well as a unique UI. All at a modest price.

Take a look for yourself via the recorded webinar.

More info:

Link to the recorded webinar:

http://www.rhinojungle.com/video/novedge-webinar-series-episode-43-rhino-5-overview-featuring-the

www.novedge.com, a leading online superstore, has lots of video demos available for many products.

Scan and Solve offers meshless FEA

19 March 2012: Recently I had a chance to sit in on a demo of Scan&Solve™, software promising to (virtually) automatically solve parts for linear stress FEA without any concern about meshing the part. Used in conjunction with solid models of parts developed with Rhino, the demo did just that. To verify the accuracy of the results, the demoer adjusted something called the resolution of the “geometry scan” of the part. Adjusting the resolution showed that the accuracy was converging. Wow! I thought. Time to find out more about how this worked, how extensible it was, how it differed from traditional FEA, and its cost. I went to the company website and soon located the founder of Intact Solutions LLC, the company that authored the software – Vadim Shapiro, Professor of Mechanical Engineering and Computer Sciences at the University of Wisconsin-Madison.

Right away, I figured uh oh, an academic. They are not always known for providing crisp answers. Nevertheless I requested and was granted an interview with Dr. Shapiro. He turned out to be very open and enthusiastic about the product.

Vadim Shapiro, Intact Solutions founder

Here is what Scan&Solve does differently from traditional FEA meshers and related solvers:

  • Scan&Solve is not a replacement for FEA; it is an extension of FEA, which aims specifically to solve the problem of CAD/CAE interoperability. Any reasonable geometric kernel and any reasonable FEA package can be interfaced with great benefits. The goals of the product are simplicity, universality, and complete automation.
  • The current version analyzes parts only, not assemblies.
  • Instead of meshing, the software assigns an analysis space (a grid) surrounding the part to be worked on, as shown below:

How Scan&Solve works

  • Scan&Solve needs an interface with the CAD system that supplies coordinates of the model for each point on the grid. Given this interface between the CAD system and S&S, there is no need for a mesh to be created; instead, the software works with the precise model geometry. Scan&Solve directly modifies the basis functions, sometimes called “shape functions” – the functions that approximate the solution of the problem. In the current implementation, these basis functions are associated not with vertices of a mesh, but with cells of the grid (of the space, not of the geometry). “Modify functions” means that they are modified to satisfy the applied restraints everywhere, not just at vertices.
  • No simplification or de-featuring of the model is needed.
  • Increasing the resolution of the grid can test convergence of the results. If a higher resolution produces large changes in the results, keep increasing the resolution. Shapiro noted, “The issue is essentially the same as with standard FEA. One can estimate the error and refine the mesh (or increase density in our case), but it is more or less the same for all techniques. We do not do anything automatically right now. We advise the users to run at different resolutions (which requires NO WORK from the user) and compare the results. If results are significantly different, increase the resolution. In principle, this can and will be automated in the future.”
  • Can work directly with polygonal models. Scan&Solve performs all analysis related computations on the native geometry (whether polygonal, NURBS, or other form of geometry). Shapiro stated that “This eliminates the need for preprocessing: no healing, smoothing, de-featuring, or meshing is needed. This drastically reduces preparation/set up time.” However, the commercial product in Rhino works only with NURBS solids.
  • It always produces results. Shapiro stated “The solution procedure is deterministic, does not use heuristics, and always produces a result. (In other words, failure means a bug in the code: not inability to handle some geometry.) The advantages of S&S are full automation, complete integration and interoperability. Use it at any stage of the design process: from concept creation to detailed geometry.”
  • Prices are very reasonable. Scan&Solve for Rhino commercial licenses are $695 for a node locked version and $1295 for a floating license. Academic, trial and rental licenses are also available. Scan&Solve for Rhino also requires a Rhino license.
  • Interfaces are currently available for a limited number of CAD systems. Scan&Solve can be applied to any geometric model and used within any geometric modeling system that supports two fundamental queries: point membership testing and distance-to-boundary computation (a minimal sketch of these two queries appears just after this list).
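To show what those two geometric queries amount to in practice, here is a minimal Python sketch for a single sphere. The class and function names, and the sphere itself, are purely illustrative; this is not Intact Solutions' actual interface, which answers the same questions for arbitrary NURBS or polygonal geometry supplied by the CAD system.

```python
# Minimal sketch of the two queries Scan&Solve relies on, shown for a sphere.
# Real CAD kernels answer these for arbitrary solids; names are illustrative.
import math

class Sphere:
    def __init__(self, center, radius):
        self.center = center
        self.radius = radius

    def contains(self, p):
        """Point membership test: is point p inside the solid?"""
        return math.dist(p, self.center) <= self.radius

    def distance_to_boundary(self, p):
        """Signed distance from p to the boundary (negative when outside)."""
        return self.radius - math.dist(p, self.center)

if __name__ == "__main__":
    part = Sphere(center=(0.0, 0.0, 0.0), radius=1.0)
    q = (0.5, 0.5, 0.5)   # a grid-cell sample point, as the "geometry scan" would use
    print("inside:", part.contains(q))
    print("distance to boundary:", part.distance_to_boundary(q))
```

Any modeler that can answer these two questions for every grid point can, in principle, be driven by this kind of analysis without ever generating a mesh.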

References: http://www.intact-solutions.com/

http://www.scan-and-solve.com/

Disclosure: No remuneration of any kind was paid for this article.

Conclusion: Both CAD and FEA vendors should check out the possibility of offering this technology as an option for users. With trial copies available from both Rhino and Intact Solutions, users wanting to extend FEA beyond the traditional analysis experts should consider the benefits and urge their CAD partners to investigate this alternative.

Does SIRI signify a common CAD UI for the future?

I think, after using it on my iPhone 4S for a few weeks, that the answer is definitely YES. And this is especially true for CAD applications. All CAD apps require a complex series of user interactions, usually performed in a rigid sequence to proceed through the process. How nice would it be to just speak what you want done and actually have the computer do the dirty work of interpreting the commands?

Just think. No more menus needed. No more searching multiple menu windows for the precise command needed. No more focusing on tiny command icons. No need to worry about sequencing the commands.

On my iPhone I can just say “Make an appointment with Bob for tomorrow at 2:00,” which is not too different from “Draw an infinite horizontal line tangent to circle A.” When Siri needs more information, it asks for it. In my query above it might say, “Do you mean Bob Albert, Bobby Jones, or Bob Smith?” How cool is that?
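As a toy illustration of what a spoken CAD command might reduce to once the speech recognizer has produced text, here is a minimal Python sketch of an intent parser. The command grammar, function names, and the clarifying question are invented for illustration; they do not correspond to Siri or to any shipping CAD product.

```python
# Toy sketch: map a recognized spoken phrase to a CAD-style command.
# Grammar and entity names are invented for illustration only.
import re

PATTERN = re.compile(
    r"draw an infinite (?P<orientation>horizontal|vertical) line "
    r"tangent to circle (?P<circle>\w+)",
    re.IGNORECASE,
)

def parse_cad_command(phrase):
    """Return a command dict, or a clarifying question when info is missing."""
    m = PATTERN.match(phrase.strip())
    if m:
        return {"command": "create_line",
                "orientation": m.group("orientation").lower(),
                "tangent_to": m.group("circle").upper(),
                "extent": "infinite"}
    # Like Siri, ask for the missing information instead of failing outright.
    return {"question": "Which entity should the line be tangent to?"}

if __name__ == "__main__":
    print(parse_cad_command("Draw an infinite horizontal line tangent to circle A"))
    print(parse_cad_command("Draw an infinite horizontal line"))
```

A real system would sit behind a speech recognizer and a much richer grammar, but the basic idea of turning a sentence into a structured command, and asking back when something is ambiguous, is the same.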

I suggest vendors immediately get busy finding smart speech recognition software.

What do you think?

——-

Autodesk 360 and Nexus – PLM 1.0: not perfect – but a great start

3 Dec 2011: Errata. I was incorrect in stating that Buzzsaw was a local PDM vault for AEC/BIM. Several people have written me about this, one being Stephen Bodnar of Autodesk. Bodnar stated that “Vault is the on-premise DM solution for both industries, whereas Buzzsaw is cloud-based and is also built on Autodesk’s Cloud, and is intended for design file collaboration between partners/suppliers and other users and does, in fact, have bi-directional push/synchronization with Vault.”

1 Dec 2011: I am on my way back from Las Vegas, where AU 2011 was held. The highlight of the event, at least for me, was the announcement of what I am calling Autodesk PLM 1.0. The announcement was not a well-kept secret, but the content of the announcement was closely held.

Monday’s media day preceded the conference. The actual PLM announcement came late Tuesday morning. Carl Bass retracted his oft-quoted remark about PLM not being something customers worried about; instead, it was revised to mean “until the technology was right.” I couldn’t agree more with his reasoning. Most of Autodesk’s competitors offer expensive, difficult-to-use, and almost impossible-to-install PLM systems that have rarely met expectations. Even then, success often comes at the cost of massive consulting assistance, rarely meets anticipated timeframes, AND generally involves the implementation of substantially revised business processes.

Unlike my analyst peers, I have always been skeptical of such large and costly projects. Not being on the implementation side, I could afford to be skeptical. Many such projects, aside from basic PDM, seldom actually get implemented. Most stall. Autodesk estimates that most deliver only PDM. To test this thesis, I tweeted my followers and asked what they had accomplished. With just a few responses, this is hardly scientific. Several stated that they did not yet have even PDM fully implemented!

So what was actually announced? The system is being called Autodesk 360. It is based on having locally installed PDM; for mechanical and for AEC this is Vault. Buzzsaw, a cloud-based application, provides design file collaboration for AEC teams. The third, and new, software piece is called Nexus. The dictionary describes the word nexus as a “connector,” which is a good description of what the software aims to do. In the following discussion I concentrate solely on mechanical PLM. For information on Buzzsaw and how it uses Nexus, readers will have to go elsewhere. Try here.

Nexus is cloud based and comes with 140 or so apps. Each app looks like a series of specialized templates, along with workflow logic that is customizable by the user. Delivery is expected by the end of March 2012. No pricing was announced; however, the implication was that it would be modest. It will be sold on a per-user subscription basis. All Nexus data and apps will run in the cloud, using an ordinary browser. The mass of data will remain locally hosted using Vault. Having and maintaining Vault locally solves the issue of loading very large cloud-based data sets while still maintaining some degree of interactivity.

How will it interface with Vault and other PDM systems? Very well with Vault. No connectors were announced to integrate with other PDM systems. Autodesk hinted that this is a good opportunity for third party developers and VARs. Connections with Nexus could be implemented via as yet unannounced APIs.

Today, the connection between Vault and Nexus is one way. CAD data cannot be sent from Nexus to Vault. Nor is it synchronized among Vaults, as is done among Apple’s iCloud apps. However, Vault data is automatically synced up to Nexus. Expect bi-directional sync in the future.

Is it easy to install and operate?

Keep in mind that my total exposure to Autodesk 360 Nexus comes from a 30-minute main-stage presentation, followed by a 60-minute working session where about 20 people per workstation watched a very capable Autodesk developer demo the software and respond to questions, often by showing us how Nexus would solve the problem posed.

Nexus appears to be an out-of-the-box system. It comes with predefined templates and workflows, yet they can easily be added to and/or modified. Fields within templates (apps), along with their characteristics (such as numeric, values, dates, etc.), can be defined on the fly. A Visio-like graphical interface defines workflows; many are offered in the starter system. A typical administration system allows assigning users to tasks and roles. Data fields can be interconnected, allowing visibility into what drives or is driven by what.
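To give a feel for what “templates with user-defined fields plus a graphical workflow” boils down to under the hood, here is a minimal Python sketch of a hypothetical template and workflow definition. The field names, states, and roles are my own guesses for illustration and are not Autodesk’s actual Nexus data model.

```python
# Hypothetical sketch of an out-of-the-box PLM app: a template with typed,
# user-defined fields and a simple workflow of states and role-gated transitions.
# Field names, states, and roles are illustrative only.

change_request_template = {
    "fields": {
        "title":        {"type": "text"},
        "severity":     {"type": "choice", "values": ["low", "medium", "high"]},
        "target_date":  {"type": "date"},
        "est_cost_usd": {"type": "numeric"},
    },
    "workflow": {
        "states": ["draft", "in_review", "approved", "released"],
        "transitions": [
            {"from": "draft",     "to": "in_review", "role": "author"},
            {"from": "in_review", "to": "approved",  "role": "engineering_manager"},
            {"from": "approved",  "to": "released",  "role": "release_admin"},
        ],
    },
}

def allowed_next_states(template, current_state, role):
    """Which states can a user in the given role move an item to from here?"""
    return [t["to"] for t in template["workflow"]["transitions"]
            if t["from"] == current_state and t["role"] == role]

if __name__ == "__main__":
    print(allowed_next_states(change_request_template, "in_review",
                              "engineering_manager"))  # -> ['approved']
```

Adding a field or a workflow step is then just editing data, not writing code, which is roughly the flexibility the demo conveyed.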

So. There you have it. I imagine Autodesk will soon, if not already, have many seminars and pre-recorded AVIs showing the software. Try here: http://usa.autodesk.com/360-lifecycle-management-software/

My conclusions

I think the product is outstanding. Being cloud based resolves many operating issues. Some users might question the security aspects of hosting much of the data remotely, and they would do well to satisfy themselves on that point one way or the other. I think that, except perhaps for very special circumstances, cloud-based security might even be vastly superior to what they could do locally. I think this is a non-issue.

Cost-wise, I think this will prove to be much less expensive, long term, than most of today’s solutions. Again, this is a non-issue. Just take a look at the slide below, presented by Stephen Bodnar, Autodesk’s VP of Data Management, which compares some costs for a 200-user deployment.

For collaboration, data can be uploaded either in summary format or as detailed CAD files. Nexus has controls over which user sees which data.

Included are project management capabilities that automatically roll up status from completed sub-tasks. Defining projects involves defining sub-projects with easily configurable tasks and reporting procedures. If you have already implemented workflow as part of Vault, it should be redone using Nexus, which allows more flexibility and better visibility.

If you want visibility by projects, by project managers and contributors, with flexibility to change workflows and processes to meet how you do business, it’s all there. My only question is how soon can I get it?

Ray with his skeptical face during AU2011

—-

Here are a few slides from the presentation to give you an idea of what Autodesk presented. Sorry for the quality – I used my phone.

The overall concept of Autodesk 360.

Stephen Bodnar discussing their view of PLM:

Why is it called 360? Showing how the Vault and Buzzsaw make up local PDM systems:

Brenda Discher discussing why users don’t like competitive PDM systems.

What Autodesk is doing about it with Nexus.