leszekp Posted October 19, 2013

Virtual whitening of fossils using polynomial texture mapping
http://palaeo-electronica.org/content/2013/465-virtual-whitening-of-fossils

I get comparable results using image enhancement, but this looks like it might be a faster and easier way.
marlin Posted October 19, 2013

Dear Leszekp,

Thanks for sharing this URL! It made for an interesting read on a Saturday morning with some coffee and breakfast. PTM/RTI really seems to be working well for what you're trying to 'see'. Have you considered documenting the same type of fossilized subject(s) with photogrammetry? You could get an interesting and detailed examination of the surface topology, as well as accurate metrics (measurements) from your models. I'm sure you've heard of photogrammetry, but in case you're looking for a bit of additional information, this URL could whet your appetite. As for the acquisition gear, you already own it if you're capturing RTI data, and shooting a photogrammetric sequence is not that much longer a time commitment (compared to RTI). Check out the link below.

http://culturalheritageimaging.org/Technologies/Photogrammetry/

Thanks again for sharing. Please post more of your work! Additional URLs?

Marlin
leszekp Posted October 19, 2013 (Author)

Hi Marlin,

Just to be clear, I'm not directly associated with the work in this paper; I found it recently in an online search. I'm interested in fossils, and have shot a few with RTI, but that's strictly avocational. The primary use of my custom-built RTI dome has been for archaeological artifacts, mainly projectile points and other lithics, but a few other types as well; whitening is often used on lithic tools to enhance surface details. I've attached a pair of photos of one item, a Clovis point from Arizona: an original digital photo and an enhanced RTI image that's comparable to what you might get using whitening. I'm presenting the technique and initial results next week at the Arizona Archaeological Council 2013 Conference in Mesa, AZ (and yes, CHI will get a shout-out).

I've been working with photogrammetry for a while now, documenting archaeological digs/features and medium-sized artifacts. The lithics I've been shooting with RTI are small (1-3 inches typical) and usually have poor contrast and very limited surface topography, so I've considered them poor candidates for photogrammetry. I just tried a test run on one of them, and sure enough, the resulting mesh is very low quality; I may try more experiments in the future if I can find the time. I am working on a Scilab program to take the RTI photo dataset and do photometric stereo analysis on it; this may provide superior normals images, and can in principle also generate high-resolution surface topography meshes.

Best, Leszek
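For readers unfamiliar with the photometric stereo approach Leszek mentions, the core computation is a per-pixel least-squares solve relating known light directions to observed intensities. The sketch below is a generic illustration in Python/NumPy (it is not Leszek's Scilab program; the light directions and images are synthetic placeholders):

```python
# Minimal photometric-stereo sketch: given k images of the same surface lit
# from k known directions, recover per-pixel surface normals by least squares.
import numpy as np

def photometric_stereo(images, lights):
    """images: (k, h, w) intensities; lights: (k, 3) unit light directions.
    Returns (h, w, 3) unit surface normals."""
    k, h, w = images.shape
    I = images.reshape(k, -1)                        # (k, h*w)
    # Solve lights @ G = I for G = albedo * normal, one column per pixel.
    G, *_ = np.linalg.lstsq(lights, I, rcond=None)   # (3, h*w)
    albedo = np.linalg.norm(G, axis=0)
    N = G / np.maximum(albedo, 1e-8)                 # normalize to unit vectors
    return N.T.reshape(h, w, 3)

# Tiny synthetic check: a flat surface facing +z, lit from three directions.
lights = np.array([[0.0, 0.0, 1.0],
                   [0.5, 0.0, np.sqrt(0.75)],
                   [0.0, 0.5, np.sqrt(0.75)]])
true_n = np.array([0.0, 0.0, 1.0])
imgs = (lights @ true_n).reshape(3, 1, 1) * np.ones((3, 4, 4))
normals = photometric_stereo(imgs, lights)
```

With an RTI dome's 40-plus light positions the system is heavily overdetermined, which is what makes the least-squares normals robust in practice.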
Taylor Posted October 20, 2013

Really interesting paper, Leszek. The paper states, "A simple command-line program was written in C to read a PTM file, estimate the normal vectors and curvatures, and make a new whitened PTM file." This was mentioned as an alternative to modifying the available PTM software. Is the C program available to share?

This paper also reminds me of the XShade software: http://gfx.cs.princeton.edu/proj/xshade/ I think XShade is being incorporated into the new CARE tools being developed by CHI and Princeton. You've probably already seen the paper by Lindsay MacDonald about calculating normals by reprocessing existing RTI data sets using photometric stereo triplets, which George Bevan mentioned in the post about CARE tools: http://forums.culturalheritageimaging.org/index.php?/topic/220-care-tool-and-surface-normals/&do=findComment&comment=447

Best, Taylor
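For context on the quoted step ("estimate the normal vectors"): a PTM stores six biquadratic luminance coefficients per pixel, and the standard normal estimate from Malzbender et al.'s original PTM work takes the light direction that maximizes the fitted luminance. Below is a hedged sketch of that closed-form maximum in Python/NumPy; it is not the paper's C program, and the coefficient values in the test are made up for illustration.

```python
# Per-pixel PTM normal estimation: the luminance model is
#   L(lu, lv) = a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5
# and the normal is taken as the (lu, lv) maximizing L, lifted to the sphere.
import numpy as np

def ptm_normal(a):
    """a: (..., 6) PTM coefficients [a0..a5]. Returns (..., 3) normals."""
    a0, a1, a2, a3, a4, a5 = np.moveaxis(a, -1, 0)
    denom = 4.0 * a0 * a1 - a2 ** 2          # Hessian determinant of the quadric
    lu0 = (a2 * a4 - 2.0 * a1 * a3) / denom  # stationary point of L(lu, lv)
    lv0 = (a2 * a3 - 2.0 * a0 * a4) / denom
    # z-component from the unit-sphere constraint, clamped for safety.
    nz = np.sqrt(np.clip(1.0 - lu0 ** 2 - lv0 ** 2, 0.0, 1.0))
    return np.stack([lu0, lv0, nz], axis=-1)
```

A real reader would also have to handle the PTM file's per-coefficient scale and bias values before applying this, which the sketch omits.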
leszekp Posted October 20, 2013 (Author)

I emailed one of the authors asking about software availability, but no reply to date. The technique appears to use fairly straightforward algorithms; maybe something that could be added to a future version of RTIBuilder/Viewer? Thanks for the references. I did see the MacDonald paper, but AFAIK that software hasn't been made publicly available either; I would love to try it out. Writing a comparable program for triplets is a bit beyond my current abilities, so I'll start with normals from the full dataset, which is a lot easier. XShade looks interesting; I'll have to give it a try with some of my 3D models.
cdschroer Posted October 24, 2013

On a related note: RTIViewer and RTIBuilder, along with the Hemispherical Harmonics (HSH) fitter, are open source under the GNU General Public License version 3, so folks can get the code and do this kind of thing themselves. I'll note that the ptmfitter is HP proprietary, though the PTM file format was published by HP as an "open" format, which means anyone can write it or read from it with other tools. The RTI file format is open as well, and the spec is made available to anyone who asks. It was developed as part of a grant-funded research project coordinated by Cultural Heritage Imaging, involving James Davis and graduate students at UC Santa Cruz, along with Tom Malzbender of HP Labs (at the time), and folks from the Italian National Research Council's (CNR) Institute of Information Science and Technologies' (ISTI) Visual Computing Laboratory (http://vcg.isti.cnr.it). We didn't have funding to take it to a standards body, but we do have a document. Write to us if you want it.

Carla
ohammer Posted March 17, 2015

Hi, I'm the author of that paper. I would be very happy to contribute to including "virtual whitening" in RTIViewer, but I'm a bit wary of having to learn the RTIViewer source code conventions and data structures. If anyone out there knows the source code and could help out, let me know! (Because of the inconvenience and bugginess of my own virtual whitening script, we have now returned to physical whitening with ammonium chloride at our museum. The smoke is horrible and the specimens get damaged; it's awful.)
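As a rough illustration of what a "virtual whitening" render does (a generic sketch of the idea, not the author's script): once per-pixel normals are estimated, the surface is re-rendered as a matte white Lambertian material, discarding the original colour, much as an ammonium chloride coating does physically.

```python
# Virtual whitening sketch: shade estimated normals as a uniform white
# diffuse surface under a chosen light direction.
import numpy as np

def whiten(normals, light):
    """normals: (h, w, 3) unit normals; light: (3,) light direction.
    Returns an (h, w) grayscale image of a matte white surface."""
    light = np.asarray(light, dtype=float)
    light = light / np.linalg.norm(light)
    shading = np.einsum('hwc,c->hw', normals, light)  # per-pixel N . L
    return np.clip(shading, 0.0, 1.0)                 # back-facing pixels go black
```

A low raking light direction exaggerates fine relief in the whitened rendering, which is the effect physical whitening is normally used to achieve.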
Taylor Posted March 17, 2015

This paper appears to provide a robust path toward more accurate estimation of surface normals, surface colours, and albedos in situations where high specularity (non-matte surfaces) or self-shadowing due to high relief is a problem (thanks to Tom for pointing me to it): http://www.faculty.idc.ac.il/toky/Publications/journal/ivc2012.pdf

I don't know whether the more robust methods described in that paper can be built easily into simple GUIs such as RTIBuilder and RTIViewer, but an alternative to chemical whitening seems needed.
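The linked paper uses a more sophisticated robust estimator than this, but the underlying idea can be illustrated simply: treat shadowed (too dark) and specular (too bright) samples as outliers and fit the normal from the remaining observations. A minimal trimmed least-squares sketch for one pixel (an illustration of the principle, not the paper's method):

```python
# Trimmed per-pixel normal fit: drop the darkest and brightest samples
# (likely shadows and specular highlights), then solve by least squares.
import numpy as np

def robust_normal(intensities, lights, trim=2):
    """intensities: (k,) one pixel's values under k lights; lights: (k, 3).
    Drops the `trim` darkest and `trim` brightest samples before solving
    lights @ g = intensities. Returns a unit normal."""
    order = np.argsort(intensities)
    keep = order[trim:len(order) - trim]   # middle of the brightness range
    g, *_ = np.linalg.lstsq(lights[keep], intensities[keep], rcond=None)
    return g / max(np.linalg.norm(g), 1e-8)
```

At least three well-spread light directions must survive the trimming for the solve to be well posed, which an RTI dome's dense light coverage easily guarantees.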
cdschroer Posted March 17, 2015

I'm glad to see this topic revived. I think Øyvind's work is really interesting and useful! We aren't actively working on RTIViewer right now, but if folks want the code in order to add this, it is available under the GNU General Public License version 3. We would look at bringing it back into the main RTIViewer code for any future releases. Øyvind has offered to share what he has.

As for the work that Taylor points to from Mark Drew et al. at Simon Fraser University, we have worked with them to make the better normal calculation more available. Note that their work has several parts, and we felt the more accurate normals were the highest priority to release. This would need to be done in a fitter, so it best fits into the RTIBuilder software. Some work has taken place to create a new tool that could be called from RTIBuilder. The initial version would produce two types of normal maps, not RTI or PTM files. Incorporating these results into RTI files would be a second step, not currently funded. Actually completing this work isn't funded either, but we all want to see it get out there. Here is a more recent paper, actually the master's thesis of Mingjing Zhang, that covers their most recent thinking: http://summit.sfu.ca/item/13719

Carla