RTI Underwater: a research project. University of Southampton



The bulb was inside a reflective holder.  I have considered that as a possibility for the laboratory tests coming up.  I don't want any motion of the water surface to create any form of reflective mirroring. (I am only familiar with this concept from sonar imaging, not photographic imaging, but I imagine there are similar issues there.)

 

In all honesty, Dennis, the data capture was not perfect.  There clearly was some motion detected in the images... but only in a few of them.  I was going to sit at the monitor, sort through them, and edit out the ones where motion was visible when I transitioned back and forth between them, but one of my technical advisors helped me run them through a Photoshop alignment tool.  It removed virtually all of the issues, and when we ran the second .ptm file it came out so good we were all ecstatic there in the computer lab.

However, what I later realized was this: if the camera moves a few pixels, whether from water current I caused by swimming around it or from physical contact I made with that ridiculous mounting bracket I invented on short notice (by bumping it slightly with the light while trying to hold my position), that type of error can easily be removed with the alignment tool, because the pixels in the picture do not move relative to each other.  The blur that remained is just one or two spots where I believe I may have caused movement of the ball itself (which, by the way, is far less stable underwater, sitting on its little metal nut that I glued to it along with a British coin, than one might think).  It did NOT take much of a swoosh to move the artifact or the ball... or possibly I moved the actual iron cylindrical base plate I used to hold up the artifact and tie a string to.  Since those pixels are now off relative to the others in the set of pictures, they could not be aligned.  It is ever so minor... but present.  I learned a lot from the first attempt.

The prototype testing device I am building in a couple of weeks will be an acrylic eight-armed 'octopus' that holds the camera mounted at the top, centered and pointing down.  Mounted to the arms will be LEDs (four to an arm), powered from the surface for a laboratory test-tank environment.  Then, when I go to the field, I will pull the LEDs off it but still use it as a 'light position template' and camera holder.  I can dive down with the octopus, position it over an artifact, set the camera, let it start firing away exposures... and all I have to do is move the light parallel to and in between each acrylic arm.  It will require no string and no contact with either the object or the camera at any time.  I expect very good results.

Also, the light I used had no diffuser on it.  It was a very bright and fairly focused beam.  I think I can put a diffuser over it, reduce the output slightly, slow down the shutter speed to compensate, and get much more even light coverage on the object.  Already my professors are talking 'video cameras and wifi and real-time processing'... but at this stage I will be most happy just getting the process down so I can duplicate positive results consistently, in a tank of water and with a tank on my back! :)  Thank you for your positive feedback.



 


the ball itself... is far less stable underwater, sitting on its little metal nut that I glued to it along with a British coin, than one might think.

 


Yeah, the difference between weight in air and weight in water takes some getting used to when you want things not only to sit on the seafloor but also to stay in place. I once wasted a lot of time making sure every component of a (non-RTI) experimental design was negatively buoyant, only to see it slide about on the seafloor. I learned to bring a lot of large lead weights!


https://drive.google.com/folderview?id=0B_bmKMWOjrbEMnNpMm1zcmIxZEk&usp=sharing

 

Ok... the research design for Underwater Reflectance Transformation Imaging (URTI) is complete.  The proposal is sitting in the Google Drive folder above. (I cited George. :) )  The file name is ProposalFINAL.pdf.  It is due in 45 minutes... :(  so it has already been sent in as is.  This is what I will be doing for the remainder of the summer.


  • 2 weeks later...

I do know there is something not quite right.  First, there was movement.  We know this.  The Photoshop alignment tool took care of much of it, but not all of it.  Whether the movement was due to the camera mount being jolted or the object shifting slightly as I moved the string is uncertain.  Next, the object is round on the back side and would not stay still underwater when I set it on its back in order to RTI the face you see.  I did not realize how buoyant it actually was underwater.  I ended up punching a hole through the white duct tape that covered the metal cylinder on which it was sitting, to accommodate the curvature of the 'deadeye'.  I was literally in a bit of a panic at that moment because I had not planned at all for my object to be so completely unstable under water.  So to be honest, I don't know if I set the object correctly and perfectly level. (It is like a rocking-chair rail, and I don't know if the front and back are actually in the same plane!)  But I don't know if any of this is what caused it.  I have concluded, though, that these were 'operator malfunctions' and not an issue with URTI viability in general.  I have countless more trials coming up in the next two months to try and get flawless results.

 

I do have some good news though.  Mr. Kirk Martinez and a research fellow named Phil Basford of the Electronics and Computer Science department helped me design a switching circuit board and an Arduino controller to automate 32 LEDs that will be fixed to my acrylic underwater light octopus/dome.  The parts are on order, and in a couple of weeks I will go into the University electronics workshop, where I have 'unlimited pick of the soldering iron of my choice', to build it.


  • 2 weeks later...

I have completed the prototype of the first 'Underwater RTI Light Dome'.

 

A HUGE thank you to Professor Mark Jones,  Master carpenter Mr. Dennis Cook,  Simon, and my new friends at the Mary Rose Museum in Portsmouth, UK for allowing me to build this in their workshop!

 

Pictures of it in the Google Drive folder:

 

https://drive.google.com/folderview?id=0B_bmKMWOjrbEYjB2V0Vna1ctcTg&usp=sharing
 

 

It is made entirely of laser-cut acrylic.  It has an effective focal length of 400 mm (inside radius) and a total base-ring OD of 880 mm.  Each leg has positions for four LEDs.  It will hold both my $600 'point and shoot' auto-focusing digital camera and the department's $8,000 worth of high-end underwater Nikon SLR equipment.  I designed it to be completely modular, although making it rigid enough would have required additional bracketing.  I ran out of time at the workshop to make these additional brackets, so unfortunately it will be completely fused together with acrylic solvent/glue prior to wiring with LEDs.
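(For what it's worth, one nice property of a fixed dome is that the light directions can, in principle, be computed straight from the geometry rather than recovered from reflective spheres.  A rough sketch of that calculation is below; the elevation angles and file names are assumed placeholders for illustration, not measurements from the actual legs.)

// Rough sketch: derive 32 light directions from the dome geometry
// (8 legs, 4 LEDs per leg). The elevation angles below are an assumed
// even spread, not the real spacing on the legs; because all LEDs sit
// at roughly the same ~400 mm inside radius, only the direction matters.
#include <cstdio>
#include <cmath>

int main() {
    const int legs = 8;
    const int ledsPerLeg = 4;
    const double elevationsDeg[ledsPerLeg] = {20.0, 40.0, 60.0, 80.0}; // assumption
    const double pi = 3.14159265358979323846;

    int imageIndex = 0;
    std::printf("%d\n", legs * ledsPerLeg);          // .lp-style header: image count
    for (int leg = 0; leg < legs; ++leg) {
        double azimuth = 2.0 * pi * leg / legs;      // legs evenly spaced around the ring
        for (int k = 0; k < ledsPerLeg; ++k) {
            double elev = elevationsDeg[k] * pi / 180.0;
            // Unit vector from the subject toward the LED.
            double lx = std::cos(elev) * std::cos(azimuth);
            double ly = std::cos(elev) * std::sin(azimuth);
            double lz = std::sin(elev);
            std::printf("image_%02d.jpg %.6f %.6f %.6f\n", imageIndex++, lx, ly, lz);
        }
    }
    return 0;
}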

 

(@George: I didn't fully understand what you meant by 'surface normal visualization.'  Somebody showed me the RTIBuilder code to display it and we took a look, though.  One of my PhD student advisors seemed to think it was due to shadows/reflectivity from the pool surface being tiled.  ??  Maybe?)

 

@Marlin... the CHI RTI kit arrived!  Thank you.  Now to start modifying the reflector balls so I can place them on target in the open sea.  I'm thinking of just gluing a nut onto each to receive a piece of threaded rod.  Then I can just push the rod down into the sand/silt so the ball sits at the same relative height as the object to be photographed.


David, 

 

If you use the HP Viewer you can select the filter "Surface Normal Visualization". It's a very good way to determine whether a PTM was successful in estimating the surface normals. This filter is not, at present, available on the RTIViewer but may well be included in a future release. I suspect shadowing is very likely the cause of the irregularities I observed. 

 

Your new dome looks really nice!


  • 2 weeks later...

@George: Thank you.  We go into the electronics lab next week and begin wiring it.  Hopefully in just a few days it will be completed, and then I can begin some flume-tank studies.  As it turned out, the department's underwater SLR camera does NOT have 'interval shooting' capability.  This means that I cannot just set it on an 'auto shoot' feature and let it roll while backing away from it to move the light underwater.  In short... it is USELESS for this research.  This is a big blow to my efforts, because it means that for underwater data gathering I don't have a manual-focus SLR to rely on.  Although I am pioneering auto-focus digital camera methodology (the kind of camera MOST diving enthusiasts will possess), it doesn't add to 'ease of research' in the slightest!

 

In the meantime, this past week I dove on HMS Invincible.  She was a revolutionary 74-gun ship design... faster, sleeker... built by the French in 1744, then captured by the British and forced into service.  For 255 years she has rested on the bottom of the Solent in c. 8 m of water.  Her huge planking was fixed to the ship's framework with 'wedged trenails': split dowels driven through bored holes with a mallet and pinned with a wedge.  She ran aground on a sandbar in the Solent in 1758.  It is a rather huge debris field at this point, with some remarkable structure still remaining.  I dove on her with an auto-focusing camera, a Benbo Traker 3 tripod, and a 1000-lumen HID dive light to capture the first ever PTM file from a submerged archaeological site.

 
The reflectance ball from my CHI RTI kit was threaded onto the rod, placed near a trenail, and a soft-pack lead weight was placed on top to hold it firmly.  In the PTM file you can see the longitudinal grain pattern in the wedge, c. 1 mm apart, and the radial grain in the trenail.  This PTM file was created from a dataset captured in c. 8 metres of water with a half-knot tidal current.  It doesn't sound like a lot of current, but that is water moving at approximately one foot per second.  It is enough current to blow an inexperienced diver off the site, and clearly 'less than ideal' for attempting URTI!

The trenail was located on a beautiful section of clean, exposed planking.  I was surrounded on all sides by smooth planking, so there was almost nothing to hold on to.  I was literally hanging onto a single little plastic yellow marker tag with one fingertip, desperately trying not to get blown into my tripod!  Lessons learned: more lead in my pockets, and place the tripod on the up-current side so you are getting blown AWAY from it next time... haha.  Anyway, the amount of sand and debris blowing past the subject during photography was enormous.
 

'We were on a rising tide. For Invincible this is not the best time because the water is flooding in from the Solent, therefore bringing dirtier water. When the water flows from the east on a falling tide it brings in clearer water from the channel.' Dan Pascoe, Project Director

 

But with underwater photography, we get less backscatter in the image when we fire the lighting from an oblique angle... which of course is the nature of RTI lighting to begin with... so I was very pleased that:


1) It appears there was absolutely no movement in the photos, even with the current (how this can be I have NO idea!  But the camera didn't move.)


2) There was little interference from the debris in the water.


Bottom line... this was once again a 'pull it out of your back side' impromptu production with all the wrong gear, etc.  1) The HID light is wrong... I can't get the right beam pattern across the subject because it is not a solid light pattern; it has hot spots and warm spots in the beam.  Instead we need a really bright LED dive light, something I don't have at the moment.  2) The camera is wrong (we have no underwater equipment in the university that can do URTI... so once again I am shooting with my little auto-focus camera).  3) The mounting is wrong... instead of a stable platform like my URTI acrylic light dome, I used a commercial tripod that had two legs in the way that I had to dodge around with the light.

 

I've uploaded the PTM file to the following link
https://docs.google.com/file/d/0B_bmKMWOjrbEU1g2NnltZlNvV2c/edit?usp=sharing

Please keep in mind that, other than the fact that nothing moved, it's kinda crappy because of the inconsistency of the light.  It is not because I could not accurately control the position freehand, but rather because the HID dive light just doesn't throw an even beam at only a few tens of centimetres from the subject.  (If I were to aim it across the parking lot at night, it would light up the tree tops like a bright, even spotlight!  But up close like that, it has super hot and super cold spots in the cone.)  I literally tried to pick a part of the light cone itself that was reasonably consistent.  That adds a thoroughly impractical element to the already tricky freehand approach.
So as you move the light around in RTIViewer it will be 'blotchy'.  I am sure that if you look at the Surface Normal Visualization it will be 'off the charts crazy', because the lighting is not even.

 

Even with all this working against us, absolute diagnostics are more than possible even with this less-than-perfect deliverable.  The 250-lumen LED I am using topside for tests (the one I normally strap to my mask!) is not bright enough underwater.  I will purchase a good, bright handheld LED before July 16-17; I've been invited back to continue URTI work on Invincible on those days.  Between that and my trip to Spain to survey the 17th-century British warship off Cataluña and the 1st-century Roman amphora pile (shipwreck) in 30 metres of water there as well, I think there will be amazing opportunities to refine this and get some really cool PTMs for presentation here around the first week of August.  The work continues!

 

Cheers,

Dave
 


  • 2 weeks later...

Six days of wiring the URTI light dome.  It is completed, with a few minor things left to do.  Basically, an Arduino controller is driving a custom switching board that controls 32 LEDs: four to a leg, eight legs.  The wiring has been enclosed in 19 mm clear heat-shrink tubing and sealed at one end for immersion.  The dome holds both an SLR underwater housing and my trusty little Fuji point-and-shoot.  In its current configuration it will only be deployed in a tank of water for turbidity experiments, so the power supply and control electronics do not require waterproofing.

I've taken all the low-res JPEGs from Facebook posts and dumped them in a folder for your review.  If you click on the first one, open it up, and then just scroll with the right arrow, it more or less tells a visual story of the process.

Today my Arduino card got 'stuck' in a glitch and caused 'LED 1, leg 1' to remain continuously on instead of following the one-second-per-light program.  This happened, of course, while I was on a coffee break.  I came back just in time to see the 5000 K bright white LED glowing royal blue in the moments before it went POOF, melted down, and sent up a puff of smoke. Haha.  A couple of hours later I had it repaired.  I am awaiting a newer (better) Arduino board to arrive in the mail, hopefully this Wednesday.

The camera is an old Nikon D300.  It is 12 megapixels.  I've wired a custom three-pin connector to it that will be soldered into one of the Arduino terminals, enabling me to drive the camera in sync with the lights.  This will be a nice feature for the experiments.

 

https://drive.google.com/folderview?id=0B_bmKMWOjrbEQkhxR283aE1sb2c&usp=sharing
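(For anyone curious about the control side: the switching board itself is custom, but the sequencing logic is roughly along the lines of the sketch below.  The pin numbers, the use of daisy-chained 74HC595 shift registers, and the shutter-pulse timing are illustrative assumptions, not my actual wiring.)

// Illustrative Arduino sketch: step through 32 LEDs (8 legs x 4 LEDs),
// one per second, firing the camera shutter while each LED is lit.
// Pin assignments and the four daisy-chained 74HC595 shift registers
// are assumptions for the sketch, not the real custom switching board.

const int DATA_PIN    = 2;   // serial data into the first 74HC595
const int CLOCK_PIN   = 3;   // shift clock
const int LATCH_PIN   = 4;   // storage/latch clock
const int SHUTTER_PIN = 5;   // drives the camera's remote-release line

const int NUM_LEDS = 32;                // 8 legs x 4 LEDs per leg
const unsigned long LED_ON_MS = 1000;   // one second per light

void setLed(int index) {
  // Shift out 32 bits with only the selected LED's bit high;
  // an index of -1 turns everything off.
  digitalWrite(LATCH_PIN, LOW);
  for (int byteIdx = 3; byteIdx >= 0; byteIdx--) {
    byte pattern = 0;
    if (index >= 0 && index / 8 == byteIdx) {
      pattern = 1 << (index % 8);
    }
    shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, pattern);
  }
  digitalWrite(LATCH_PIN, HIGH);
}

void fireShutter() {
  // Pulse the shutter line; the pulse width needed depends on the camera.
  digitalWrite(SHUTTER_PIN, HIGH);
  delay(100);
  digitalWrite(SHUTTER_PIN, LOW);
}

void setup() {
  pinMode(DATA_PIN, OUTPUT);
  pinMode(CLOCK_PIN, OUTPUT);
  pinMode(LATCH_PIN, OUTPUT);
  pinMode(SHUTTER_PIN, OUTPUT);
  setLed(-1);                 // start with all LEDs off
  delay(5000);                // time to get clear of the tank
}

void loop() {
  for (int i = 0; i < NUM_LEDS; i++) {
    setLed(i);                // light LED i only
    delay(200);               // let the light settle before triggering
    fireShutter();            // capture one image under this light
    delay(LED_ON_MS - 200);   // hold for the rest of the one-second slot
    setLed(-1);               // all off between exposures
    delay(250);
  }
  while (true) { }            // run the 32-light sequence once, then stop
}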
 


  • 3 months later...

October 10, 2013

 

The URTI research project was a tremendous success, in part due to some of the fantastic input I received from individuals on this blog.  I was able to generate PTM files from an 18th-century wooden wreck in the cold, turbid currents of the Solent, and PTM files from a 1st-century BC Roman shipwreck in the Western Mediterranean.  Both sets of field PTMs offered diagnostic resolution in archaeological wood.  In one particular PTM, I was able to isolate individual 'tool carving planes' in a mark on a floor timber.  This diagnosis was of particular value to the Spanish maritime archaeologists studying the shipwreck.  It showed them... that... well... we 'know that we know that we know that somebody PUT these marks on this particular timber with a bladed tool.'

 

Controlled laboratory turbidity experiments in this master's research proved to be extremely challenging for a variety of reasons (discussed in detail in the dissertation).  I had to attempt the experiment three complete times to pull it off, and each attempt required many hours of laboratory preparation at the National Oceanography Centre sediment analysis laboratory.  I almost gave up on it, having already achieved more than enough for high marks in a master's dissertation with the shipwreck PTMs.  However, I was 'strongly encouraged' by my advisors that, with regard to the turbidity objectives of the dissertation, 'failure... is not an option'... haha.  In the end, I was able to shoot 16 pixel-registered PTMs of a piece of Samian ware (a Roman pottery shard) underwater in our test tank, using the fully automatic and fully submersible fixed lighting dome I built for the research.  I varied the turbidity by adding one gram of powdered bentonite clay between each PTM.

 

The results were fascinating.  I was able to demonstrate mathematically that the amount of progressive 'noise' in the source JPEG images was anywhere from 1.5 to 2.5 times higher than the amount of noise in the PTM normal renders for each of the 16 associated PTMs.  In other words, the PTMs proved to be far more robust in their ability to accurately render the image under progressively turbid conditions than the very source images used to generate them.  (I believe there is some 'averaging' going on in the biquadratic fitting that generates the PTMs, which is producing the cleaner results.)  This is an empirical result and one we will be publishing shortly.
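(My working explanation for that 'averaging', for anyone interested: a PTM fits each pixel's brightness by least squares to the biquadratic L(lu, lv) = a0·lu² + a1·lv² + a2·lu·lv + a3·lu + a4·lv + a5, where lu and lv are the components of the light direction.  Fitting just six coefficients per pixel to the 32 exposures from the dome should smooth out noise that is uncorrelated from frame to frame.  That is a plausible mechanism rather than something the experiments demonstrate directly.)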

 

I am a firm believer in sharing both the data and the dissertation with all interested parties, and I will make them accessible here on this CHI blog, with links, as soon as the URTI publications are in the pipeline.  We are working on the publications right now.  I would anticipate being able to provide the link to this body of research material as early as November.



Again, thank you all for your interest and input.

 

Sincerely,


Dave Selmo


  • 3 months later...

A quick note about space on this forum site:

 

We are using a hosting service for the forum, and the amount of disk space we get is pretty small.  That is why we restrict how much each user can upload.  The advantages of the hosted site are that the company keeps security patches installed and helps us quickly resolve any technical issues.  We could upgrade to a more expensive package, but this package fits our traffic and number of users, apart from the space limitations.  We do encourage folks to post information and put links here.  For example, we post PDF files and images to our website or to Flickr and then put links here.  I know it isn't ideal, but our primary goal with the forums was to provide a place for conversation, where people can inform and help each other.

 

I hope this clarifies why we have the limits we have.

 

Carla


  • 3 years later...
  • 4 months later...
On 3/24/2013 at 8:04 PM, GeorgeBevan said:

Interesting project, David. I have thought of doing RTI underwater, but there are formidable technical challenges in deploying the highlight method underwater, not least stirring up sediment. I'd suggest that a fixed dome + camera system, possibly mounted on an ROV, would be ideal. The movement as the ROV hovers could be compensated for by post-processed alignment. Using photometric stereo rather than RTI to process the data-set may also be advantageous, as you'll need many fewer images than for PTM/HSH.

 

Alternatively, I think the best way to do what you want is to shoot the photogrammetry (preferably using ADAMTech) and then create a "Virtual RTI" in Blender or another rendering package. The reason I suggest ADAMTech is that it will generate vastly more points than any other package... the calibration of the camera needs to be very precise to get good stereo alignment to generate the points.

 

One significant application we've seen for this kind of surface metrology underwater is understanding machining or woodworking marks of vessels (provided there isn't a huge amount of corrosion).

 

I work extensively with the underwater archaeology unit at Parks Canada...they may be interested as well in some of the applications you have in mind. 

RTI's only "deliverable" is what end-users can see with the naked eye. RTI consistently presents more surface detail than conventional
I know it is very hard to predict the future of this, but I would be very thankful if you could guide me further.

