Face Triangulation LOD, .NET 5 and Core

Today, we discuss cores, splinters and data:

- Using .NET 5 and Core
- Using .NET Standard 2.0 for Revit Add-Ins
- Controlling Face Triangulation LOD
- SQL Versus NoSQL

Using .NET 5 and Core

Olli Kattelus, MEP Software Engineer at the Finnish MagiCAD Group, updated the Revit API discussion forum question "Does Revit target .NET Standard?" to include coverage of .NET 5:

Question: I would like to use .NET Core to build my plugin, but I'm a bit confused about whether it will cause issues or not. From my understanding of .NET, if Revit targets a specific .NET Standard, I will be able to use any implementation of .NET (.NET Core or .NET Framework) that conforms to that standard. Does Revit target versions of .NET Standard, or does it just target versions of the .NET Framework?

Answer: This is clearly defined on the Revit 2021 API development requirements and getting started page:

The Revit Platform API is fully accessible by any language compatible with the Microsoft .NET Framework 4.8, such as Visual C# or Visual Basic .NET (VB.NET). Both Visual C# and VB.NET are commonly used to develop Revit Platform API applications. However, the focus of this manual is developing applications using Visual C#.

If your .NET Core code is compatible with the Microsoft .NET Framework 4.8, all should be fine.

Question: Will we be able to use the Revit API with .NET 5?

Answer: The factory should try to be prepared to answer. So far, there are complications. Revit consumes Autodesk-wide .NET components; we'd need to ensure that those are .NET 5 compatible before we switch our runtime. The Revit API also runs in-process, using Revit's runtime, so I'm not sure it would be possible to preserve Revit's 4.8 runtime and allow add-in code to run on .NET 5.0. We have not tested either scenario yet.

Basically, developers will be able to reference Revit's DLLs (.NET 4.8) in their .NET 5 projects, but there is no guarantee that everything will work. Some (if not most) things might, but I would not recommend going that way. The problem is that .NET 5 is based on .NET Core, not on the big .NET Framework, and there are some incompatibilities.

As for switching Revit to .NET 5, that's something we will definitely need to do, as .NET 4.8 is the last version of the big .NET. However, the switch is not as trivial as changing the version dropdown (like it was from 4.7 to 4.8). We will have to convert to a new project format, fix some code, and possibly find replacements for some frameworks that were present in .NET 4.8 and are no longer in .NET 5.

To understand more about .NET 5 Core and Framework enhancements, I found the official Microsoft overview of What's new in .NET 5 pretty illuminating.

Using .NET Standard 2.0 for Revit Add-Ins

Thiago Almeida added a helpful comment clarifying how to proceed:

It is possible to create projects targeting .NET Standard to share code between .NET Core and .NET Framework.

.NET Framework 4.6.1 is compatible with .NET Standard 2.0, which covers most of the commonly used APIs across the board. The actual recommendation is to start from .NET Framework 4.7.2, as per consideration (2) on Microsoft's compatibility sheet (search for ".NET implementation support").

Translating to Revit API add-ins: you should be able to share code between .NET Core platforms (.NET Core 3.1 or .NET 5) and Revit 2019 and above (.NET Framework 4.7.2) using .NET Standard 2.0.

Thank you very much for that, Thiago!
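To make that concrete, here is a minimal sketch of what such a shared-library project file might look like; SharedCore is a hypothetical name, and your version numbers may differ:

```xml
<!-- SharedCore.csproj: hypothetical shared library targeting
     .NET Standard 2.0. It can be referenced both from a Revit
     add-in project targeting .NET Framework 4.7.2 or 4.8 and
     from .NET Core 3.1 or .NET 5 projects. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>
```

The Revit add-in project itself would keep targeting the .NET Framework and referencing RevitAPI.dll directly; only the Revit-independent logic goes into the shared .NET Standard library.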

Controlling Face Triangulation LOD

A very useful solution for the desktop Revit API came about after observing a significantly different level of detail when triangulating a face in a Revit add-in running in Forge Design Automation:

Question: I am working on a Revit add-in that I modified to run in the Forge Design Automation environment. I am using a CustomExporter to export a Revit model to an OBJ file. It uses the Triangulate method on the Autodesk.Revit.DB.Face class to triangulate some of the faces in the model at a certain level of detail (LOD). The LOD value is passed to the Face.Triangulate method. I noticed that when running the add-in in Design Automation on Forge, the triangle count of the exported model is much higher than when it is run locally on my desktop computer. I would like the detail of the model to be the same when exported on Forge as when it is exported on a desktop PC. Why might this be happening, and how could it be fixed?

Answer: My first suspicion was that the LOD is affected by the graphics screen properties. In the Forge environment, no real screen is attached, and that may cause a much higher resolution to be assumed.

The development team respond:

In our CustomExporter, we set the value of ViewNode.LevelOfDetail in OnViewBegin. However, we don't know what this does in comparison to Face.Triangulate.

In the version of Face.Triangulate that takes a levelOfDetail input of type double, that input controls the granularity of the triangulation. levelOfDetail should lie in the range [0.0, 1.0], with 0.0 being the coarsest and 1.0 the finest. The internal code uses an integer "level of detail" in the range [0, 15], and the input to Face.Triangulate is mapped to an integer by dividing the range [0.0, 1.0] into sixteen equal segments (i.e., multiply by 16, round down, and restrict to the range [0, 15]).

This ends up at the internal function GFace::updateCachedFacets, which also takes various other things into account (whether there's view-specific data, properties of the face in question, etc.).
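Expressed in C#, that mapping might look something like the following sketch, reconstructed from the description above rather than from the actual Revit source:

```csharp
using System;

static class LodMapping
{
  // Sketch of the described mapping from the [0.0, 1.0]
  // levelOfDetail input to the internal integer LOD in [0, 15];
  // reconstructed from the prose above, not Revit source code.
  public static int ToInternalLod( double levelOfDetail )
  {
    // Divide [0.0, 1.0] into sixteen equal segments:
    int lod = (int) Math.Floor( levelOfDetail * 16.0 );

    // Restrict the result to the integer range [0, 15]:
    return Math.Max( 0, Math.Min( 15, lod ) );
  }
}
```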

There's also a version of Face.Triangulate that takes no input and uses a different approach to choose a triangulation granularity.

By design of CustomExporter 3D, the main (only?) factor controlling the quality should be the view node's level of detail, as mentioned above. However, what happens inside Face.Triangulate is separate from CustomExporter. The scale of the view might have something to do with any discrepancy between different export workflows. On the other hand, the Revit version and the presence of a UI should not have an impact, although we cannot completely rule those factors out.

The Revit version can certainly play a role in principle. For example, improvements to face triangulation are made from time to time, and that changes the way certain objects are triangulated in some cases.

We have thought that the presence or absence of a UI would have no impact on other things and been proven wrong before, e.g., with garbage collection and image export.

If it is reproducible, we should look at it as a problem report. In particular, it might be reproducible even without CustomExporter, simply by running the same small routine in the Revit UI and in DA: go to one reasonably complex element, find a target face and triangulate it.

Can you provide a minimal reproducible case for the development team to analyse?

Response: I am using the 2020 version of Revit, both on desktop and in DA.

We have now solved our problem.

The solution was simply to set ViewNode.LevelOfDetail to the desired level of detail in IExportContext.OnViewBegin and then collect all the geometry in the IExportContext.OnPolymesh callback instead of the OnFaceBegin callback.
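In other words, the export context ends up looking something like this minimal sketch; class and member names are illustrative, and transforms, materials and error handling are omitted:

```csharp
using Autodesk.Revit.DB;

// Minimal sketch of an export context along the lines described
// above: pin the level of detail in OnViewBegin and collect the
// already-triangulated geometry in OnPolymesh, rather than
// calling Face.Triangulate in OnFaceBegin.
class LodExportContext : IExportContext
{
  public int VertCount { get; private set; }
  public int FacetCount { get; private set; }

  public bool Start() { return true; }
  public void Finish() { }
  public bool IsCanceled() { return false; }

  public RenderNodeAction OnViewBegin( ViewNode node )
  {
    node.LevelOfDetail = 8; // desired LOD in [0, 15]
    return RenderNodeAction.Proceed;
  }
  public void OnViewEnd( ElementId elementId ) { }

  public RenderNodeAction OnElementBegin( ElementId elementId )
  {
    return RenderNodeAction.Proceed;
  }
  public void OnElementEnd( ElementId elementId ) { }

  public RenderNodeAction OnInstanceBegin( InstanceNode node )
  {
    return RenderNodeAction.Proceed;
  }
  public void OnInstanceEnd( InstanceNode node ) { }

  public RenderNodeAction OnLinkBegin( LinkNode node )
  {
    return RenderNodeAction.Proceed;
  }
  public void OnLinkEnd( LinkNode node ) { }

  public RenderNodeAction OnFaceBegin( FaceNode node )
  {
    // No Face.Triangulate here; let OnPolymesh deliver
    // the tessellated geometry instead.
    return RenderNodeAction.Proceed;
  }
  public void OnFaceEnd( FaceNode node ) { }

  public void OnRPC( RPCNode node ) { }
  public void OnLight( LightNode node ) { }
  public void OnMaterial( MaterialNode node ) { }

  public void OnPolymesh( PolymeshTopology node )
  {
    VertCount += node.NumberOfPoints;
    FacetCount += node.NumberOfFacets;
    // ... append node.GetPoints() and node.GetFacets()
    // to the OBJ output here ...
  }
}
```

Driving it is then just a matter of instantiating a CustomExporter with the document and this context and calling its Export method on the target 3D view.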

Before this, we were collecting some geometry by getting faces in IExportContext.OnFaceBegin and then triangulating those faces with Face.Triangulate(LOD).

For some reason, the resulting geometry had a much higher triangle count when the add-in was run in Design Automation.

We do not need any more assistance on this.

If the development team wants to reproduce it, here are the basic steps to do so:

- Modify the add-in to run in Design Automation.
- Modify the code so that the value of vertCount can be verified both in the desktop version and in the DA version.

I was expecting the value to be the same both on desktop and in DA; however, in my experience, the model was much more detailed when exporting in DA.
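For reference, the vertex-count check might be as simple as the following sketch, where the face and the 0.5 LOD value are hypothetical stand-ins:

```csharp
using Autodesk.Revit.DB;

static class VertCountCheck
{
  // Hypothetical check: triangulate one face at a given LOD
  // and return the resulting vertex count, to be logged and
  // compared between a desktop run and a Design Automation run.
  public static int GetVertCount( Face face, double lod )
  {
    Mesh mesh = face.Triangulate( lod ); // lod in [0.0, 1.0]
    return mesh.Vertices.Count;
  }
}

// e.g., int vertCount = VertCountCheck.GetVertCount( face, 0.5 );
```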

SQL Versus NoSQL

I became a fan of NoSQL while working on the FireRatingCloud project, a multi-RVT-project reimplementation of the FireRating SDK sample.

It forms part of my research connecting desktop and cloud prior to the emergence of Forge and uses a REST API to access a cloud-based NoSQL MongoDB database managed by a node.js web server.
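For illustration, the desktop side of that round trip boils down to simple REST calls; here is a hedged sketch with a purely hypothetical URL and route, not the real FireRatingCloud endpoint:

```csharp
using System.Net.Http;
using System.Threading.Tasks;

static class FireRatingClient
{
  static readonly HttpClient _client = new HttpClient();

  // Hypothetical GET request to the node.js web server that
  // manages the MongoDB database; URL and route are placeholders.
  public static Task<string> GetDoorRatingsAsync( string projectId )
  {
    return _client.GetStringAsync(
      "https://example.com/api/v1/doors/" + projectId );
  }
}
```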

I remain a fan of NoSQL, even after being intrigued by an article by John Biggs and Ryan Donovan asking, Have the tables turned on NoSQL?

They conclude that NoSQL is "not so great for your side hustle" and that

a consensus has emerged in conferences and blogs that SQL is the gold standard – with a lot of emphasis on PostgreSQL – and you should use it by default, only deviating if you have good reasons to use NoSQL.

I assume they know a lot more than I do in this area, so I guess I should trust their judgement more than mine in this case.

But I naively continue to prefer NoSQL anyway :-)