Archaeological Artefact Studies

Hi, I am an archaeologist experimenting with photogrammetry and 3D modeling. I came across your software while trying to find ways to interrogate my models for dimension info. I was wondering if there is a way to change the units of measurement within your application to match my models. They were captured in meters; is there a way to change or amend the scale within the app?

Also, is there a way to cross-section a 3D model, essentially doing the reverse of the segmentation approach of your software?

If you know for a fact that the units of your 3D models are in meters, then you can set the default units from millimeters to meters in Slicer (Edit->Application Settings->Units).

Unfortunately, not all modules of Slicer support these custom units, so your mileage may vary.

Another approach that is guaranteed to work is to create a meter-to-millimeter transform: go to the Transforms module, create a new transform called m2mm, and set the first three diagonal values to 1000. You can then assign your 3D model to this transform, and when you measure distances they should be reported in correct mm values.
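
For anyone who prefers the Python console, here is a minimal scripted sketch of the same idea (“MyModel” is a placeholder for your model node’s name):

```python
import slicer
import vtk

# Create a linear transform node that scales meters to millimeters
m2mm = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLLinearTransformNode", "m2mm")
matrix = vtk.vtkMatrix4x4()
for i in range(3):
    matrix.SetElement(i, i, 1000.0)  # 1 m -> 1000 mm on each axis
m2mm.SetMatrixTransformToParent(matrix)

# Assign your model to the transform ("MyModel" is a placeholder node name)
modelNode = slicer.util.getNode("MyModel")
modelNode.SetAndObserveTransformNodeID(m2mm.GetID())

# Optionally, make the scaling permanent:
# slicer.vtkSlicerTransformLogic().hardenTransform(modelNode)
```

After this, distance measurements on the model should come out in true millimeters.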

Thanks so much. That makes a lot of sense, and I can give that a try. You guys are quick at getting back to me!

Can you please elaborate? I’m trying to imagine the reverse of segmentation, but it escapes me 🙂 The only thing that comes to mind for a cross section are the 2D (MPR) views, but you can already see those.

This is an interesting use case. If it’s not an inconvenience I’d love to hear a bit about the project.

Hi, of course, I can elaborate. I’m not sure I’m getting it right, but I assume the software builds a 3D model from medical scan data of some sort. I am essentially starting from the end. I have built 3D scans of archaeological artefacts, namely flint tools, and would like to be able to view them in cross-section if that makes sense. I am mainly trying to gain as much dimensional info about the artefacts as possible to help determine if physical dimensions are a factor in the tools selected by early humans.

I could be getting this all wrong, but if I am, please let me know, and thanks for getting back to me.


Can you share at least one of your models so that people here may better understand your workflow?

To see the cross-sections of a 3D model in slices, all you have to do is enable the slice views:
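
If scripting is convenient, the same thing can be done from the Python console; a minimal sketch (“MyModel” is a placeholder node name, and SetVisibility2D needs a reasonably recent Slicer):

```python
import slicer

modelNode = slicer.util.getNode("MyModel")  # placeholder node name
modelNode.GetDisplayNode().SetVisibility2D(True)  # show the model outline in slice views

# Optionally show the slice planes in the 3D view as well
for sliceNode in slicer.util.getNodesByClass("vtkMRMLSliceNode"):
    sliceNode.SetSliceVisible(True)
```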

What modality were these 3D models generated from? If it is something like photogrammetry or another surface-scanning method, there won’t be any internal structure to see, simply a shell.

If they were CT scanned somewhere, then there would be some internal detail, but most of that information is lost when the volumetric segmentation is converted to a closed-surface representation. You can expect to see something like:

Hi,
That’s great to know. My models were built using photogrammetry and are of stone artefacts, so there’s no internal structure per se, but the cross-sections may give me a more visual representation of the sharp cutting edges vs. the blunt or hafted sides, etc. I had already loaded models in but couldn’t see a cross section in those panels; is there something I need to enable?

Thanks,
Gavin.

Sorry, how would I share the model: just an image of it in the app, or the model itself?

So you basically have STLs (or OBJs or similar)? It is complicated to get cross-sections of models without images, as Slicer’s assumption is that every workflow starts with an image. There have been other topics here on the forum about this, but I think there was no good solution.

A simple hack that I suggest is:

  • Load CTChest from Sample Data module
  • Volume render it
  • Enable interaction on the CTChest node (right-click eye and choose Interaction)
  • Move it so that it fully overlaps your model
  • Window/level so that it’s all black (to remove confusing medical data)
  • Now you can slice away

Let us know if this is good enough, if you need higher resolution, or if you encounter other problems.
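
If you would rather script those steps, here is a rough sketch of the same hack for the Python console. The interaction handles are enabled through a transform node, which is roughly what the right-click option does, and the window/level values are simply picked to sit above any CT intensity so the slices render black:

```python
import slicer
import SampleData

# 1. Load CTChest from the Sample Data module
ctVolume = SampleData.downloadSample("CTChest")

# 2. Volume render it
vrLogic = slicer.modules.volumerendering.logic()
vrDisplay = vrLogic.CreateDefaultVolumeRenderingNodes(ctVolume)
vrDisplay.SetVisibility(True)

# 3. Enable interaction handles so the volume can be dragged over the model
moveTransform = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLLinearTransformNode", "MoveCT")
ctVolume.SetAndObserveTransformNodeID(moveTransform.GetID())
moveTransform.CreateDefaultDisplayNodes()
moveTransform.GetDisplayNode().SetEditorVisibility(True)

# 4. Window/level the slice display so the CT shows as all black
ctDisplay = ctVolume.GetDisplayNode()
ctDisplay.SetAutoWindowLevel(False)
ctDisplay.SetWindowLevelMinMax(5000, 6000)  # well above any CT value

# 5. Show the volume in the slice views, then slice away
slicer.util.setSliceViewerLayers(background=ctVolume)
```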

Sharing one would be useful. I suggest posting a Google Drive link or whatever cloud storage you use.

Another option is to load a copy of the STL as a segmentation object and then convert the segmentation to a labelmap. That will give you a volume and slices that you can interact with…
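
A minimal scripted version of that idea, in case it is useful (the file path is a placeholder):

```python
import slicer

# Load the STL directly as a segmentation instead of a model
segmentationNode = slicer.util.loadSegmentation("/path/to/BLADE.stl")  # placeholder path

# Export the segmentation to a labelmap volume that the slice views can display
labelmapNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLLabelMapVolumeNode")
slicer.modules.segmentations.logic().ExportVisibleSegmentsToLabelmapNode(
    segmentationNode, labelmapNode)

slicer.util.setSliceViewerLayers(label=labelmapNode)
```

From there the labelmap behaves like any other volume in the slice views.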

We have a tool in SlicerMorph to combine those two steps. I haven’t looked at it for a while, but it is called ImportSurfaceToSegment.


Thanks for that. I’ve had a play around with what you suggested, but I think I’m missing a step or something and can’t quite get it to work, as I’m not familiar with the software. I have added a couple of models below so that you can have a look. I’ll keep trying, though.

AE0259 AREA N D4 855-5448 BLADE.stl - Google Drive, AE0259 AREA N D4 855-5448 BLADE.obj - Google Drive

Thanks for this, I’ll have a go at it!

In ‘Application settings / Units’, if you select the ‘Meter’ preset for ‘Length’ under ‘Advanced options’ and set a coefficient of 1000, measurements appear in ‘m’ as below.

I don’t know if that’s what you require.

This is what your data looks like if it is loaded as is into Slicer:

This is my suggested method of creating a scaling transform (m2mm) and assigning the object to it:

This is what @chir.set suggested: modifying the coefficient field of the Units settings to 1000 while keeping the unit as mm (this actually works better than my suggestion of changing the unit to meters):

Each of these has its own drawbacks. With the first one, you have to do the conversion mentally by multiplying the reported values by 1000. With the second one, you have to remember to put the object under the transform every time you want to take a measurement. With the third one, every dataset you load will have its measurements multiplied by 1000 (not just your data).

So which one is the optimal solution for you depends on what you want to do.
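
For completeness, the Units coefficient (the third option) can also be set from the Python console. This is a sketch based on the usual unit-node lookup pattern; treat the details as an assumption to verify rather than a guaranteed recipe:

```python
import slicer

# Look up the application's active "length" unit node
selectionNode = slicer.app.applicationLogic().GetSelectionNode()
unitNode = slicer.mrmlScene.GetNodeByID(selectionNode.GetUnitNodeID("length"))

# Multiply displayed measurements by 1000 while keeping the "mm" suffix,
# so meter-scale coordinates are reported as true millimeters
unitNode.SetDisplayCoefficient(1000.0)
```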

Hi, sorry I’m late responding to your message. This is interesting; how can I load the STL as a segmentation object?

Thanks very much for your detailed response. I shall try them out and see what works best. You’ve all been very helpful, and it’s great to have a community like this.