rtiunderwater last won the day on June 20 2014

Location: Southampton, UK / Knoxville, TN
Contact: mos11b1p@hotmail.com
  1. October 10, 2013 The URTI research project was a tremendous success, in part due to some of the fantastic input I received from individuals on this blog. I was able to generate PTM files from an 18th century wooden wreck in the cold, turbid current of the Solent, and PTM files off a 1st century BC Roman shipwreck in the Western Mediterranean. Both sets of field PTMs offered diagnostic resolution in archaeological wood. In one particular PTM, I was able to isolate individual 'tool carving planes' in a mark on a floor timber. This diagnosis was of particular value to the Spanish maritime archaeologists studying the shipwreck. It showed them… that… well… we 'know that we know that we know that somebody PUT these marks on this particular timber with a bladed tool.' Controlled laboratory turbidity experiments in this master's research proved to be extremely challenging for a variety of reasons (discussed in detail in the dissertation). I had to attempt the experiment three complete times to pull it off. Each time required many hours of laboratory preparation at the National Oceanography Centre sediment analysis laboratory. I almost gave up on it, having already achieved more than enough for high scores in a master's dissertation with the shipwreck PTMs. However, I was 'strongly encouraged' by my advisors that, with regard to the turbidity objectives of the dissertation, 'failure… is not an option' ...haha. In the end, I was able to shoot 16 pixel-registered PTMs of a piece of Samian ware (a Roman pottery shard) underwater in our test tank, using the fully automatic and fully submersible fixed lighting dome I built for the research. Between each PTM I increased the turbidity by adding one gram of powdered bentonite clay. The results were fascinating.
I was able to mathematically demonstrate that the amount of progressive 'noise' in the source JPEG images was anywhere from 1.5 to 2.5 times higher than the amount of noise in the PTM normal renders, for each of the associated 16 PTMs. In other words… the PTMs proved to be far more robust in their ability to accurately render the image under progressively turbid conditions than the very source images used to generate them. (I believe there is some 'averaging' going on in the bi-quadratic equation generating the PTMs that is producing the clearer results.) This is an empirical result and one we will be publishing shortly. I am a firm believer in sharing both the data and the dissertation with all interested parties, and will make them accessible here on this CHI blog with links as soon as the URTI publications are in the pipeline. We are working on the publications right now. I would anticipate being able to provide the link to this body of research material as early as November. Again, thank you all for your interest and input. Sincerely, Dave Selmo
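For reference on that 'averaging' intuition: a PTM stores, at every pixel, a biquadratic function of the projected light direction (l_u, l_v) (after Malzbender et al.), with coefficients fitted across all the input images:

```latex
L(u, v;\, l_u, l_v) \;=\; a_0 l_u^2 + a_1 l_v^2 + a_2 l_u l_v + a_3 l_u + a_4 l_v + a_5
```

Because the six per-pixel coefficients are a least-squares fit over dozens of frames, uncorrelated frame-to-frame noise (such as backscatter speckle) tends to be smoothed out of the fitted surface, which is consistent with the 1.5 to 2.5x noise ratios reported above.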
  2. Six days of wiring the URTI light dome. It is completed, with a few minor things left to do. Basically, an Arduino controller is driving a custom switching board that controls 32 LEDs: 4 to a leg, 8 legs. The wiring has been enclosed in 19 mm clear heat-shrink tubing and sealed on one end for immersion. The dome holds both an SLR underwater housing and my trusty little Fuji point-n-shoot. In its current configuration it will only be deployed in a tank of water for turbidity experiments, so the power supply and control electronics do not require waterproofing. I've taken all the low-res JPEGs from Facebook posts and dumped them in a folder for your review. If you click on the first one, open it up... and then just scroll with the right arrow... it more or less tells a visual story of the process. Today my Arduino card got 'stuck' in a glitch, which caused 'LED 1, leg 1' to remain continuously on instead of following the 1-second-per-light program. This happened, of course, while I was on a coffee break. I came back just in time to see the 5000 K bright white LED glowing royal blue in the moments before it went POOF, melted down, and sent up a smoke puff. haha. A couple of hours later I had it repaired. I am awaiting a newer (better) Arduino board to arrive in the mail, hopefully this Wednesday. The camera is an old Nikon D300. It is 12 Mpix. I've wired a custom three-pin connector to it that will be soldered into one of the Arduino terminals, enabling me to drive the camera in sync with the lights. This will be a nice feature for the experiments. https://drive.google.com/folderview?id=0B_bmKMWOjrbEQkhxR283aE1sb2c&usp=sharing
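For readers curious about the control logic, here is a minimal Python sketch of the firing schedule described above (8 legs x 4 LEDs, roughly one light per second, camera fired in sync with each light). The function name and the exact dwell time are illustrative assumptions, not the real firmware:

```python
def capture_schedule(legs=8, leds_per_leg=4, dwell_s=1.0):
    """Yield (leg, led, start_time_s) for each exposure, one LED at a time.

    In the real rig an Arduino switches the LED on, triggers the camera
    via the custom three-pin connector, waits, then moves to the next LED.
    """
    t = 0.0
    for leg in range(legs):
        for led in range(leds_per_leg):
            yield (leg, led, t)
            t += dwell_s

schedule = list(capture_schedule())  # 32 exposures, about 32 seconds total
```

A full dome pass is therefore just over half a minute, which is why the stuck-LED glitch only cost one position rather than a whole capture set.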
  3. @George: Thank you. We go into the electronics lab next week and begin wiring it. Hopefully in just a few days it will be completed, and then I can begin some flume tank studies. As it turned out, the department underwater SLR camera does NOT have 'interval shooting' capability. This means that I cannot just set it on an 'auto shoot' feature and let it roll while backing away from it to move the light underwater. In short... it is USELESS for this research. Big blow to my efforts, because it means that for underwater data gathering I don't have a manual-focus SLR to rely on. Although I am pioneering auto-focus digital camera methodology (the kind of camera MOST diving enthusiasts will possess), it doesn't add to 'ease of research' in the slightest! In the meantime, this past week I dove on the HMS Invincible. She was a 74-gun revolutionary ship design... faster, sleeker... built by the French in 1744, then captured by the British and forced into service. For 255 years she has rested on the bottom of the Solent in c. 8 m of water. Her huge planking was fixed to the ship framework with 'wedged trenails'. These are split doweling driven through bore holes with a mallet and pinned with a wedge. She ran aground on a sandbar in the Solent in 1758. It is a rather huge debris field at this point, with some remarkable structure still remaining. I dove on her with an auto-focusing camera, a Benbo Traker 3 tripod, and a 1000 lumen HID dive light to capture the first ever PTM file from a submerged archaeological site. The reflectance ball from my CHI RTI kit was threaded to the rod, placed near a trenail, and a soft-pack lead weight placed atop to hold it firmly. In the PTM file you can see the longitudinal grain pattern in the wedge, c. 1 mm apart, and the radial grain in the trenail. This PTM file was created from a dataset captured in c. 8 meters of water with a half-knot tidal current.
It doesn't sound like a lot of current, but that is water moving at approximately one foot per second. It is enough current to blow an inexperienced diver off the site, and clearly 'less than ideal' for attempting URTI! The trenail was located on a beautiful section of clean exposed planking. I was surrounded on all sides by smooth planking, so there was almost nothing to hold on to. I was literally hanging onto a single little plastic yellow marker tag with a single fingertip, desperately trying not to get blown into my tripod! Lessons learned: more lead in my pocket, and place the tripod on the up-current side so you get blown AWAY from it next time.. haha. Anyway... the amount of sand and debris blowing by the subject during photography was enormous. 'We were on a rising tide. For Invincible this is not the best time because the water is flooding in from the Solent, therefore bringing dirtier water. When the water flows from the east on a falling tide it brings in clearer water from the channel.' (Dan Pascoe, Project Director) But with underwater photography we get less backscatter in the image when we fire the lighting from an oblique angle... which of course is the lighting nature of RTI to begin with... so I was very pleased that: 1) it appears there was absolutely no movement in the photos, even with the current (how this can be I have NO idea! But the camera didn't move); and 2) there was little interference from the debris in the water. Bottom line... this was once again a 'pull it out your back side' impromptu production with all the wrong gear, etc. etc. 1) The HID light is wrong... I can't get the right beam pattern across the subject because it is not a solid light pattern; it has hot spots and warm spots in the beam. Instead we must use a really bright LED dive light, something I don't have at the moment. 2) The camera is wrong (we have no underwater equipment in the university that can do URTI... so once again I am shooting with my little auto-focus camera). 3) The mounting is wrong...
instead of a stable platform like my URTI acrylic light dome, I used a commercial tripod that had two legs in the way that I had to dodge around with the light. I've uploaded the PTM file to the following link: https://docs.google.com/file/d/0B_bmKMWOjrbEU1g2NnltZlNvV2c/edit?usp=sharing Please keep in mind that, other than it not moving, it's kinda crappy because of the inconsistency in the light. It is not because I could not accurately control the position freehand, but rather because the HID dive light just doesn't throw an even beam at only a few tens of centimeters from the subject. (If I were to aim it across a parking lot at night, it would light up the tree tops like a bright, even spotlight! But up close like that, it has super hot and super cold spots in the cone.) I literally tried to pick a part of the light cone itself that was reasonably consistent. That adds a whole impractical side to the already tricky freehanding approach. So as you move the light around in RTI Viewer it will be 'blotchy.' I am sure if you look at the surface normal visualization it will be 'off the charts crazy' because the lighting is not even. Even with all this working against us, absolute diagnostics are more than possible, even with this less-than-perfect deliverable. The 250 lumen LED I am using topside for tests (the one I normally strap to my mask!) is not bright enough underwater. I will purchase a good bright handheld LED before July 16-17. I've been invited back to continue URTI work on Invincible on those days. Between that and my trip to Spain to survey the 17th century British warship off Cataluña, and the 1st century Roman amphora pile (shipwreck) in 30 meters of water there as well, I think there will be amazing opportunities to refine this and get some really cool PTMs for presentation here in the first week of August or so. The work continues! Cheers, Dave
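A quick sanity check on the current-speed figure above (a half-knot expressed in feet per second), using the exact definition of the knot:

```python
# 1 knot = 1852 m/h exactly; 1 ft = 0.3048 m exactly.
KNOT_IN_M_PER_S = 1852.0 / 3600.0   # ~0.514 m/s

def knots_to_fps(knots):
    """Convert a speed in knots to feet per second."""
    return knots * KNOT_IN_M_PER_S / 0.3048

half_knot = knots_to_fps(0.5)       # ~0.84 ft/s, i.e. roughly a foot per second
```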
  4. I have completed the prototype of the first 'Underwater RTI Light Dome'. A HUGE thank you to Professor Mark Jones, master carpenter Mr. Dennis Cook, Simon, and my new friends at the Mary Rose Museum in Portsmouth, UK for allowing me to build this in their workshop! Pictures of it are in the Google Drive folder: https://drive.google.com/folderview?id=0B_bmKMWOjrbEYjB2V0Vna1ctcTg&usp=sharing It was made entirely of laser-cut acrylic. It essentially has a focal length of 400 mm (inside radius) and a total base-ring OD of 880 mm. Each leg features positions for four LEDs. It will hold both my $600 'point and shoot' auto-focusing digital camera and the department's $8000 worth of high-end underwater Nikon SLR equipment. I designed it to be completely modular, although making it firm enough would have required additional bracketing. I ran out of time at the workshop to make these additional brackets. So unfortunately, it will be completely fused together with acrylic solvent/glue prior to wiring with LEDs. (@George: I didn't fully understand what you meant by 'surface normal visualization.' Somebody showed me the RTIBuilder codes to display it and we took a look, though. One of my PhD student advisors seemed to think it was due to shadows/reflectivity of the pool surface being tile. ?? Maybe?) @Marlin: the CHI RTI kit arrived! Thank you. Now to start modifying the reflector balls so I can place them on target in the open sea. I'm thinking of just gluing on a nut to receive a piece of threaded rod for each. Then I can just push the rod down into the sand/silt at the same relative height as the object to be photographed.
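To make the dome geometry concrete, here is a small sketch that places the 32 LEDs on a 400 mm radius hemisphere. The even 45-degree azimuth spacing of the legs and the four elevation angles are my assumptions for illustration (a later post mentions starting at 45 degrees with roughly 10-degree drops), not measured values from the actual dome:

```python
import math

R_MM = 400.0  # inside radius / focal length of the dome

def led_positions(radius=R_MM, legs=8, elevations=(45.0, 35.0, 25.0, 15.0)):
    """Return (x, y, z) in mm for each LED; subject at the origin, z up.

    ASSUMPTION: legs evenly spaced in azimuth, elevations as given above.
    """
    pts = []
    for leg in range(legs):
        az = math.radians(leg * 360.0 / legs)
        for el_deg in elevations:
            el = math.radians(el_deg)
            pts.append((radius * math.cos(el) * math.cos(az),
                        radius * math.cos(el) * math.sin(az),
                        radius * math.sin(el)))
    return pts
```

Every point sits exactly 400 mm from the subject, which is what keeps the light intensity comparable across all 32 captures.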
  5. I do know there is something not quite right. First, there was movement. We know this. The Photoshop alignment tool took care of much of it, but not all of it. Whether the movement was due to the camera mount being jolted or the object being moved slightly while I moved the string remains uncertain. Next, the object is round on the back side and would not remain still underwater when I set it on its back in order to RTI the face you see. I did not realize how buoyant it actually was underwater. I ended up punching a hole through the white duct tape that covered the metal cylinder on which it was sitting, to accommodate the curvature of the 'deadeye'. I was literally in a bit of a panic at that moment, because I had not planned at all for my object to be so completely unstable under water. So to be honest, I don't know if I set the object correctly and perfectly level. (It is like a rocking chair rail, and I don't know if the front and back are actually in the same plane!) But I don't know if any of this is what caused it. I have concluded, though, that these were 'operator malfunctions' and not an issue with URTI viability in general. I have countless more trials coming up in the next two months to try and get flawless results. I do have some good news though. Mr. Kirk Martinez and a research fellow named Phil Basford of the Electronics and Computer Science department helped me design a switching circuit board and an Arduino controller to automate 32 LEDs that will be fixed to my acrylic underwater light octopus/dome. The parts are on order, and in a couple of weeks I will go into the University electronics workshop, where I have 'unlimited pick of the soldering iron of my choice', to build it.
  6. University blog. Nothing new. They just wanted it documented. http://blog.soton.ac.uk/cma/blog/2013/05/underwater-reflectance-transformation-imaging-a-success/
  7. https://drive.google.com/folderview?id=0B_bmKMWOjrbEMnNpMm1zcmIxZEk&usp=sharing OK... the research design for the Underwater Reflectance Transformation Imaging (URTI) research is completed. The proposal is sitting in the Google Drive folder above. (I cited George.) The file name is ProposalFINAL.pdf. It is due in 45 minutes... so it has already been sent in as is. This is what I will be doing for the remainder of the summer.
  8. The bulb was inside a reflective holder. I have considered that a possibility in the laboratory tests coming up. I don't want any motion of the water surface to create any form of reflective mirroring. (I am only familiar with this concept in sonar imaging, not photographic imaging, but I imagine there are some similar issues there.) In all honesty, Dennis, the data capture was not perfect. There clearly was some motion detected in the images... but only in a few of them. I was going to sit at the monitor, sort through them, and edit out the ones where there was motion when I transitioned back and forth between them, but one of my technical advisors helped me run them through a Photoshop alignment tool. It removed virtually all the issues, and when we generated the second .ptm file it came out so well that we were all ecstatic there in the computer lab. However, what I later realized was... if the camera moves a few pixels, from either the water current I caused by swimming around it, or perhaps even physical contact I made against that ridiculous mounting bracket I invented on short notice (by bumping it slightly with the light while trying to hold my position)... that type of error can easily be removed with the alignment tool, because the pixels in the pic do not move relative to each other. However, the blur that remained is in just one or two spots where I believe I may have caused movement of the ball itself (which, by the way, sitting on the little metal nut and British coin I glued to it, is far less stable underwater than one might think). It did NOT take much of a swoosh to move the artifact or the ball... or possibly I moved the actual iron cylindrical base plate I used to hold up the artifact and tie a string to. Since those pixels are now off relative to the others in the set of pics, they could not be aligned. It is ever so minor... but present. I learned a lot from the first attempt.
The prototype testing device I am building in a couple of weeks will be an acrylic 8-armed 'octopus' that holds the camera mounted at the top, center, and pointing down. Mounted on the arms will be LEDs (four to an arm), powered from the surface for a laboratory test tank environment. Then… when I go to the field, I will pull the LEDs off it, but still use it as a 'light position template' and camera holder. I can dive down with the octopus, position it over an artifact... set the camera and then let it start firing away exposures… and all I have to do is move the light parallel to and in between each acrylic arm. It will require no string and no contact with either the object or the camera at any time. I expect very good results. Also, the light I used had no diffuser on it. It was a very bright and fairly focused beam. I think I can put a diffuser over it, reduce the output slightly, slow down the shutter speed to compensate, and get a much more even light coverage on the object. Already my professors are talking 'video cameras and Wi-Fi and real-time processing'... but at this stage I will be most happy just getting the process down to duplicate positive results consistently in a tank of water, and with a tank on my back! Thank you for your positive feedback.
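On the alignment point above (a whole-frame camera shift leaves the pixels unmoved relative to each other, so software can undo it): here is a minimal numpy sketch of the underlying idea, using phase correlation to recover an integer shift between two frames. Photoshop's alignment is far more sophisticated; this only shows the core principle:

```python
import numpy as np

def estimate_shift(ref, moved):
    """Estimate the integer (dy, dx) such that moved == roll(ref, (dy, dx))."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(moved)
    cross = np.conj(F1) * F2
    # Normalising leaves only the phase ramp; its inverse FFT peaks at the shift.
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    dy = dy - h if dy > h // 2 else dy   # wrap to signed shifts
    dx = dx - w if dx > w // 2 else dx
    return dy, dx

# Undo the shift with np.roll(moved, (-dy, -dx), axis=(0, 1)). An object that
# moved relative to the rest of the frame cannot be fixed this way, which is
# why the blurred spots around the ball survived the alignment pass.
```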
  9. PTM file, info file, and JPG capture of the snooker ball light layer are in the 'RTI underwater' Google Drive folder. https://drive.google.com/folderview?id=0B_bmKMWOjrbEMnNpMm1zcmIxZEk&usp=sharing
  10. Mission accomplished. A PTM file of an underwater data set as nice as if it had been done under a dome in the studio. There were a couple of spots where I apparently put slight motion into either the object or the camera during light-position transitions, although the vast majority of pics stacked flawlessly. Still, the few motion errors were enough to blur the image from those angles in the RTI Viewer. So I took the pics and ran them through a Photoshop tool that aligned them at the pixel level (with some help.. haha). It took about ten minutes. Then a second attempt at a PTM file produced excellent results. Any suggestions on how I can continue to share the results? I think I only have 38k of upload space left on this forum, and that won't even be enough to post the project proposal.
  11. Here are a few samples from the raw set: oh, never mind. Looks like I've reached my CHI "upload limit" haha
  12. George nailed it. The little 250 lumen handheld didn't cut it in the water. Took 1500 pics today. Seven complete sets around the object. At least ONE of the sets WORKED! When I threw 1000 lumens on it with my 15 watt HID, I got results as nice as under the dome in the lab. So after HOURS of sorting through all these pics, breaking them down into their seven folders, and sorting through the 200+ in 'set #4' (the HID set), I've pulled 80 pics: 13 positions around the object, with 5-7 pics in the horizontal per position. I SO wanted to process these tonight and post a few JPEGs of the underwater results next to the lab dome results... but I can't figure out how to make the damn RTIBuilder software work. It keeps saying "the jpeg-exports folder is missing in your project directory"... so I gave up. Made an appointment (via Facebook) for Monday to process them back in the lab.
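For what it's worth, the error message above suggests RTIBuilder wants the source images inside a 'jpeg-exports' subfolder of the project directory. A hedged sketch of setting that up; the 'jpeg-exports' name comes straight from the error message itself, while everything else here (function name, copy-versus-move) is my assumption:

```python
import pathlib
import shutil

def prepare_project(project_dir, jpeg_paths):
    """Copy source JPEGs into <project_dir>/jpeg-exports, creating it if needed.

    ASSUMPTION: only the 'jpeg-exports' folder name is taken from the error
    message; the rest of the layout is a guess at what the tool expects.
    """
    exports = pathlib.Path(project_dir) / "jpeg-exports"
    exports.mkdir(parents=True, exist_ok=True)
    for src in jpeg_paths:
        shutil.copy(src, exports / pathlib.Path(src).name)
    return exports
```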
  13. This is the LED supplier I've been in contact with: http://www.leds.co.uk/ The attachment is the LED they suggest I use for the project. They are 490 lumens each. That is twice the output of the torch I am using today in the pool; however, the torch is focused through a lens, and these will not be. I'm guessing they will be perfect, and they should survive freshwater submersion. Test2.pdf
  14. Thank you for the words of encouragement. @George: embarrassed to say, but I'm not 100% sure what you are asking. I'm guessing you would like me to take the camera out of the contraption and take a photogrammetry capture set as well. Will do. @Taylor: 60 degrees, not a problem. More than enough room to do so. I had said 45 only because in our team meeting the professors suggested 45; they found the higher position apparently didn't add that much to the surface studio results. Truth is, I'll be wingin' it anyway. It is just another 5-second movement one position higher, so why not. There is a question as to whether or not the black snooker ball will reflect sufficiently in the pool environment, and there was talk in the meeting about using a 'mirrored' ball. I am also bringing a 1000 lumen HID cave diving light. I will try them all. Finally, I REALLY appreciate the articles you guys keep sending. On the one hand, this is 'research' and stands on its own as such… but on the other hand it is a 'graded dissertation', for which a thorough review of past and present literature and work is critical to the grade. So thank you. @Marlin: I have to generate 20,000 words on this when all is said and done… I am documenting EVERYTHING! Today the GoPro will be on my forehead… haha. UPDATE: The Mary Rose Museum conservation department at the naval dockyards in Portsmouth, UK has opened up their wood shop and acrylics fabrication shop (with laser cutter) to me in support of the project. This is very good news. With the help of their master carpenter, my intention is to build an 8-arm underwater umbrella out of acrylic. Each arm will have four LEDs mounted (32 total). The device will sit inside a water tank with the LEDs powered from the surface. The goal is to create datasets that determine the effects of turbidity on the quality/usability of RTI output. I remain committed to two paths: the indoor laboratory setting, and the outdoor field data-gathering setting.
If the pool session works today, I will learn all I need to know to be able to pull this off on an 18th century British wreck site off the coast of Spain this June. Instead of the acrylic umbrella being the light source in the field, it will serve simply as a 'template/guide' with which to 'freehand' move my handheld torch. (i.e. dive down on the wreck, position the umbrella over the object, drop the camera in its holder, adjust settings, then begin continuous capture mode while moving the light, using the arms only for positional reference. That way, at this early stage of research, I do not need to build a fully submersible electronic device, but will hopefully be able to duplicate 'laboratory control' of light positioning in the field.) Two hours to pool time. Cheers.
  15. Very-short-notice pool session this Thursday, May 2nd, 2013. I dropped what I was doing and spent an entire day at the University of Southampton Centre for Maritime Archaeology "dive shed" in the back. The effort has produced THIS. Keep in mind… not a single "tool" was employed in the construction of this (minus my trusty pocket knife). Why? Because the CMA doesn't have any. The entire thing has been assembled from stuff lying around in the shed: archaeology planning-frame iron, a piece of aluminum ladder, a red iron welded 'box thingy', and a 30" Stilson pipe wrench that weighs about 20 pounds. The whole lot is held together with duct tape and string. An assessment of all my available underwater lights left me with an Intova 250 lumen LED wide-angle (backup cave diving light) as the light of choice. (Tom Malzbender personally scrutinized my half a dozen underwater lights last night on Skype… I had to aim the laptop cam at the ceiling and put on a light show for him… haha.) The beam is bright enough for photography within a meter, and features neither hotspots nor dead spots in the center of the beam (unlike my HIDs and various other high-intensity LEDs that are focused). The focal plane is approx. 60 cm from the camera. (Maybe a bit far? I may lift the object up off the pool floor some to bring it closer.) The department's underwater SLR camera may or may not be available this Thursday for the pool session. Therefore, the camera I'm using is my trusty 12-megapixel Fujifilm FinePix F200EXR digital, and it likes to attempt to auto-refocus between every shot. However, when set on continuous mode (firing 1 pic every 2 seconds for as long as the battery and disk will go), and testing batches of approx. 150 pics at a time, 90% remain in focus when I DON'T change the lighting (ha ha).
This percentage will surely go down when I move the light to 32 different positions, but I believe I will be able to weed them out of the capture set before importing the batch of pics for processing. I am aiming to capture a photo from 4 positions (starting at 45 degrees, then 3 positions lower in approximately 10-degree drops) in 8 arcs equidistant around the object on the pool floor. The object is a small piece of 18th century 'deadeye' wooden ship rigging (about the size of your hand) that I found on a survey project I'm working on. I have a nice surface RTI PTM file of it for control/comparison. A heavy iron cylinder will rest on the pool floor (seen in the pic below) beneath the camera, with the object sitting on it (I will cover it with a small piece of white cloth for contrast) and the snooker ball off to the side in the same focal plane. A cave-line lanyard loop is able to spin freely 360 degrees underneath the object, so I can pretty much manually control the torch and keep it at the right distance. This is NOT how I wanted to start this project out!!! But, hey… adapt, improvise, overcome. (Incidentally, the project proposal for the ACTUAL project is still not fully drafted. The due date is May 16th. I will post a draft prior to submission. Maybe you guys can shoot holes in it, and then I will edit and hopefully get a better grade.)
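Since weeding blurry frames out of a 150+ shot burst by eye gets tedious, here is one common automated approach sketched in Python: score each frame by the variance of its Laplacian and keep only the sharp ones. This is a suggestion for that weeding step, not the method actually used on the day, and the threshold would need tuning per camera and scene:

```python
import numpy as np

def focus_score(img):
    """Variance of a 4-neighbour Laplacian; higher means sharper.

    img is a 2-D grayscale float array.
    """
    lap = (-4.0 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return float(lap.var())

def keep_sharp(frames, threshold):
    """Drop any frame whose focus score falls below the threshold."""
    return [f for f in frames if focus_score(f) >= threshold]
```

A practical way to pick the threshold is to score the whole burst, plot the distribution, and cut below the obvious cluster of in-focus frames.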