PhD Studentship in Photonics for Environmental Monitoring

Update – this position has been filled. We do, however, welcome applications for PhD Studentships in any of our research areas at any time.

Department of Engineering Photonics
Cranfield University, Bedfordshire, UK

Fully funded: £13,726 pa stipend, tax free*
Tuition fees paid*
Start date Oct 2013. Application deadline 16th June 2013

A PhD studentship is available in photonics-based gas sensing, fully funded by the Natural Environment Research Council (NERC). The aim is to develop new instrumentation for atmospheric monitoring in collaboration with the NERC / Met Office Facility for Airborne Atmospheric Measurement (FAAM), which is also based at Cranfield.

If you want to join us, there is more information on this link

*  Stipend rate for 2013-2014. UK residents only. Eligibility rules apply.

 

COS Reviews: All-fiber multifunction continuous-wave coherent laser radar at 1.55 μm for range, speed, vibration, and wind measurements

Review of ‘All-fiber multifunction continuous-wave coherent laser radar at 1.55 μm for range, speed, vibration, and wind measurements’, Christer J. Karlsson, Fredrik Å. A. Olsson, Dietmar Letalick, and Michael Harris, Applied Optics, vol. 39 (21), (2000)

Reviewed by John Davenport and the Cranfield Optics Society.

This paper describes the function of a newly developed coherent laser radar (CLR) system and its application to wind measurements, hard target measurements and vibrometry. Along with basic scientific and photonics terminology, the paper assumes the reader is familiar with LIDAR operation. Readers not already familiar with these areas would be advised to read up on them in advance.

The structure of the paper was good, with well laid out and clear sections. This is particularly valuable as it is relatively long and quite involved. Where equations are used, they are quoted clearly in the text, with a table of terms given part way through. A reader wishing to locate part of the work within the paper would not have difficulty doing so.

The technical detail given in this paper is probably its main strength. Full details of how the system was put together are included, and a qualified reader is provided with enough information to repeat the work. The results section is broken down into noise analysis, hard target measurements, wind measurements and vibrometry measurements (more on this later). Each is analysed rigorously and concisely. The potential and limitations of the system are clearly stated.

Hard target measurements are taken using sandblasted aluminium plates as stationary or moving targets. Over distances of several kilometres, range measurements can be taken with accuracies of a few metres, and the line-of-sight component of velocity can be measured to within around 0.1 m s⁻¹. Results conform reasonably well to theoretical expectations, although there is some uncertainty as to the contribution of turbulence.
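As a quick sanity check on that figure (our own back-of-the-envelope working, not a calculation taken from the paper), the Doppler shift the instrument must resolve at 1.55 μm is:

```latex
f_D = \frac{2\,v_{\mathrm{los}}}{\lambda}
    \approx \frac{2 \times 0.1\ \mathrm{m\,s^{-1}}}{1.55 \times 10^{-6}\ \mathrm{m}}
    \approx 1.3 \times 10^{5}\ \mathrm{Hz} \approx 130\ \mathrm{kHz}
```

so a 0.1 m s⁻¹ line-of-sight velocity resolution corresponds to resolving a beat frequency of roughly 130 kHz in the heterodyne signal.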

Two methods of wind measurement are discussed: using backscattering from a region of air, and focusing onto a single aerosol particle. The former allows measurements to be taken even under clear atmospheric conditions, showing improvements on earlier systems. The second is novel to this study and presents the opportunity for increased sensitivity.

The system is described as ‘multifunction’, being able to perform position and velocity measurements of wind, similar measurements of hard targets, and long range vibrometry measurements. While the first two are clearly different functions, it was our opinion that the vibrometry measurements were not fundamentally different from hard target measurements and did not really warrant a separate mention.

To summarise, the paper describes the CLR system, concentrating on technical detail. Details of both construction and testing are presented, describing use for wind measurements and hard target measurements. Layout is clear and easy to follow. This paper is recommended for readers wishing to work with CLR or LIDAR systems or wishing to understand what they are capable of.

The wheels fell off

Pictured above, a non-ideal funding cycle

You know that long and frustrating funding cycle I talked about a while back? Well, the wheels have fallen off mine, causing both a horrible mixed metaphor and my contract to run out.

When I finished my PhD, the Department of Engineering Photonics very kindly offered to keep me on as a PostDoc. As I mentioned before, there was no specific funding or project for me to work on, but they seemed to want to keep me and they found some money to pay my salary. The idea was that they would fund me until I managed to get a grant to pay my own way. This is a fairly typical situation that many PostDocs find themselves in – bring in grants or look forward to spending a lot more time at home…

So my main focus over the last 6 months has been identifying and applying for grants. It’s not the most fun you get to have as a researcher, but it is a necessary evil. To be honest, I quite enjoy writing grant applications, as it is, in essence, an exercise in communicating your work and ideas to an unknown audience – which, as you may have noticed from this blog, is something I quite enjoy doing.

The problem with this exercise in preparing grants is that I can’t just bang one out and send it off; they take time to prepare and even longer to authorise (not as long as some of my other funding ideas, but still quite a while). This is especially true of the grants I was preparing as collaborations with other groups. It is estimated that any grant, from conception to money (assuming it’s successful), takes around 9–12 months. Unfortunately, it turns out that the department only had money to keep me on for 6 months – which, as those of you who are good at maths may have already realised, is less than 9–12 months.

As I said, originally it was hoped that the department would keep me on for long enough for one or more of these grants to mature, but due to a number of external factors the department cannot afford to do this. In fact, a number of other grants are finishing around the same time and several of my colleagues will also be leaving the department in the coming months. Such is the nature of current research funding – there is very little room to keep people on without project grants.

So what next…?

Well for me, I have been applying like crazy for another PostDoc position at other universities around the country. While I would have very much liked to get some of my projects funded, I am quite looking forward to working in a new group and learning a whole new branch of science. I am also hoping that by moving to a bigger university I can look at taking my PGCEP (lecturing qualification) and getting back to the supervisory work I used to do during my time in industry. While I am waiting to hear back about those PostDoc positions, I don’t plan on being idle with my time, and you can follow what I’ll be up to on my new blog – The Errant Scientist.

Open Optics has been fantastic to work on for the department; I’ve really enjoyed writing it and it’s been really good fun talking to people about it and getting feedback from a huge variety of individuals. From a department perspective, it’s also been great at getting our work more publicity – Jane’s review article had a big up-tick in downloads after we publicised it here and is currently sitting at number 2 in the journal’s most frequently downloaded papers. Steve will be looking after the blog without me and has promised to keep it as up-to-date as possible. He won’t be posting quite as regularly as me, but I am pleased that it is now seen as an important tool of the department.

Wavey bye

TL;DR – My contract is up and I’m leaving – because working for free is not good at feeding a hungry family. I am applying to new PostDoc positions elsewhere, but in the meantime I am working as a consultant again and you can read all about my current projects on errantscience.com

(TL;DR = too long, didn’t read)

"Engineering optics, from bench top to bedside"

The head of our department organised a talk yesterday by Bruce Tromberg. Bruce Tromberg is Professor of Biomedical Engineering; Professor (jointly) in the School of Medicine at the University of California, Irvine; and a Director of the Beckman Laser Institute. Additionally, he is also… okay, I’m going to stop there, as I’ve just found his CV posted on his homepage and it runs to 59 pages and I only wanted to write a 1000-word article. I think you can take it for granted that Professor Tromberg is very much at the bleeding edge of optics and is highly respected for his work in the application of optical techniques. If you’d like to know more you can visit his Wikipedia page (yes, he has a Wikipedia page!).

Professor Tromberg’s original talk was to be focused on “Medical Imaging in Thick Tissues Using Diffuse Optics” – a field that is one of the many focuses of his group’s work at UC Irvine. However, when he was discussing the current activities of our group on his way into the university, he decided that we would most likely enjoy a broader talk covering numerous applications of photonics. So about 2 hours before he was due to speak, he re-wrote the presentation (by combining a few others, I assume…) and made us a whole new one-and-a-half-hour presentation on “Engineering Optics – from bench top to bedside”. Below are my notes on this talk.

———————————————————————–

Professor Tromberg’s first degree is in chemistry, but his career has ended up moving him more towards photonics and medical applications. In addition to his many other roles, he is a professor of surgery (non-practising) and has very close ties to active medical groups. His group is part of the Beckman Foundation and consists of around 20 faculty and ~120 researchers. The Beckman Foundation is a group aimed at promoting research in chemistry and the life sciences, with a focus on developing new and innovative instruments and materials, as well as supporting young researchers – you can read more here.

Currently, a number of medical imaging technologies work on the principle of bringing the patient to the device. This can be seen for CAT scans and MRIs – both of which have a very high initial capital cost and high running costs, and require skilled and often dedicated operators. The ideal model for future systems is to move away from this patient-to-technology model and develop devices that can be used in a wider range of settings at the point of care (e.g. at home or in local clinics). Current working examples of photonics being used to promote this move to point-of-care include simple things such as smartphone apps that are capable of accurately recording your pulse.

NOTE: I had heard of these heart rate apps before, but I had assumed they were one of the many scam/hoax apps on the phone and wouldn’t be any good. However, after Professor Tromberg’s talk we downloaded one and tried it out, and found it to be very accurate. I strongly recommend that you download one and give it a go (links to the apps we used are: iPhone or Android).

Other examples of enabling technologies derived from photonics, range from the now very common LASIK to the more recent use of optical coherence tomography to study and understand conditions of the eye.

The next phase in the exploitation of photonics is further into the field of diagnostics, using photonics systems to see beneath the surface to the disease itself. There is a wide range of techniques that can achieve this, ranging from nanoscopy (high resolution but short penetration) to diffuse tomography (deeper penetration at the cost of resolution).

Graph reproduced from Professor Tromberg’s talk with permission

Non-Linear Optical Microscopy

One technique that Professor Tromberg has worked with is the use of non-linear optical microscopy in a clinical setting. These microscopes exploit a number of optical effects to image different structures within the dermal layers – collagen gives a clear second harmonic generation signal; cells (specifically the NADP and FAD within them) show a fluorescence response; structural features tend to be strong scatterers; and finally, lipids can be imaged using a technique known as coherent anti-Stokes Raman spectroscopy (CARS). One of the most striking uses of this technology was in the imaging of suspected lesions within the skin of a patient. In the image below you can clearly see individual cells and the distinctive change in morphology in possible lesions.

Reproduced from Professor Tromberg’s talk with permission
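For readers unfamiliar with the contrast mechanisms listed above, the frequency relations behind second harmonic generation and CARS are the standard textbook ones (general background rather than anything taken from the talk):

```latex
\omega_{\mathrm{SHG}} = 2\,\omega_{\mathrm{pump}},
\qquad
\omega_{\mathrm{CARS}} = 2\,\omega_{\mathrm{pump}} - \omega_{\mathrm{Stokes}}
```

with the CARS signal resonantly enhanced when the difference ω_pump − ω_Stokes matches a molecular vibrational frequency (for lipids, typically the CH₂ stretch), which is what gives the technique its chemical specificity.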

These microscopes have now been commercialised and are used in hospitals across the US, and are finding widespread use in looking at a number of biologically relevant structures.

In Vivo Microendoscopy

Doppler OCT image of a mouse cortex. Reproduced from Professor Tromberg’s talk with permission

Also within the UC Irvine group, Professor Zhongpin Chen has been at the forefront of the use of a number of in vivo optical techniques that allow for the imaging of internal lumens (e.g. airways and blood vessels). The combination of OCT and ultrasound allows a detailed 3D picture to be built up of the vessel walls and, in the case of blood vessels, of the build-up of plaque. This technique is allowing doctors, for the first time, to accurately image regions ahead of stent implant surgery and to monitor possible re-growth around the stent afterwards.

This kind of OCT imaging can also be used to create 3D maps of blood vessels using Doppler OCT. This technique relies on calculating the correlation between multiple images to show the moving areas within a depth of around 5 mm.
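To illustrate the correlation idea, here is a minimal Python sketch (my own illustration, not the group’s processing chain; the window size and the assumption of two co-registered B-scan frames held as NumPy arrays are mine):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def correlation_map(frame_a, frame_b, win=5):
    """Windowed cross-correlation between two co-registered OCT frames.

    Static tissue stays highly correlated between frames, while moving
    scatterers (flowing blood) decorrelate, so 1 - correlation lights up
    the vessels.
    """
    a, b = frame_a.astype(float), frame_b.astype(float)
    mean_a, mean_b = uniform_filter(a, win), uniform_filter(b, win)
    cov = uniform_filter(a * b, win) - mean_a * mean_b
    var_a = uniform_filter(a * a, win) - mean_a ** 2
    var_b = uniform_filter(b * b, win) - mean_b ** 2
    eps = 1e-12  # guard against division by zero in featureless regions
    return cov / np.sqrt(np.maximum(var_a, eps) * np.maximum(var_b, eps))

# flow_map = 1.0 - correlation_map(frame1, frame2)  # bright where scatterers move
```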

In vivo macroscopic imaging

This area of photonics imaging can be considered the domain of wide-field cameras. This kind of imaging often works by analysing reflected light from the surface of the skin over a wide area using one or more specialist illumination methods. Spatial frequency domain imaging (SFDI) is one such technique being used by surgeons within the Beckman Institute; it uses a similar method to the Microsoft Kinect system for the Xbox 360 – it projects a series of lines in the IR range onto the subject and then alters the spatial frequency of these lines. Using the reflected data, it is then possible to build up a low-resolution (~5 mm) map of the underlying blood vessel structures. In addition to the structure, it is also possible to derive the concentration of oxygenated haemoglobin, which is highly significant for a number of clinical applications.

SFDI imaging of blood vessels. Reproduced from Professor Tromberg’s talk with permission
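For the curious, the demodulation step usually used in SFDI (which I assume underlies images like the one above, though it was not spelled out in the talk) captures three images with the projected pattern shifted by a third of a period and recovers the modulation amplitude at each pixel:

```latex
M_{AC}(x, f_x) = \frac{\sqrt{2}}{3}
\sqrt{\left(I_1 - I_2\right)^2 + \left(I_2 - I_3\right)^2 + \left(I_3 - I_1\right)^2}
```

where I₁, I₂ and I₃ are the images taken with the pattern phase-shifted by 0, 2π/3 and 4π/3. How M_AC falls off with spatial frequency f_x depends on the tissue’s absorption and scattering, and fitting that fall-off is what yields the haemoglobin maps.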

Despite the relatively low resolution, this system is being used by surgeons to determine the blood oxygenation of tissues during surgery. If, for example, a flap of skin that is being manipulated during surgery loses too much of its oxygen supply, it may cause complications in future recovery. By monitoring a live image of the oxygenation of that tissue, the surgeon can see what further action needs to be taken.

NOTE: Professor Tromberg showed an excellent movie of this being used mid-operation; however, it didn’t transfer over correctly, so I can’t include it in this post.

This kind of macroscopic imaging is also being used along with fluorescent tags in cancer surgery to identify the lymph nodes within the patient. Traditionally, this was done using blue dye injected into the patient and visually tracked by the surgeon. However, this process can be error prone and deeper lymph nodes can be missed. The use of macroscopic imaging techniques can improve this by showing the surgeon the movement of the dye through a number of possibly hidden structures.

Diffuse optics

Finally, Professor Tromberg talked about the use of optical techniques to extract biologically relevant information from a large area using diffused light sources. An excellent example of this being put into practice is the use of NIR blood oxygen monitoring before and during surgery to improve survival rates and to identify high-risk patients. This technology has now also been expanded to monitor patient reaction to common anaesthetics and provide a wealth of new insight into how they can influence patient recovery.
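The arithmetic behind this kind of NIR oxygenation monitoring is the modified Beer–Lambert law; the simplified two-chromophore picture below is general background rather than the exact model used by the group:

```latex
\Delta OD(\lambda) =
\big[\varepsilon_{\mathrm{HbO_2}}(\lambda)\,c_{\mathrm{HbO_2}}
   + \varepsilon_{\mathrm{Hb}}(\lambda)\,c_{\mathrm{Hb}}\big]\, L\,\mathrm{DPF}(\lambda),
\qquad
S_{t}O_2 = \frac{c_{\mathrm{HbO_2}}}{c_{\mathrm{HbO_2}} + c_{\mathrm{Hb}}}
```

where ε are the molar extinction coefficients, L the source–detector separation and DPF the differential pathlength factor. Measuring the attenuation at two or more wavelengths gives enough equations to solve for the two haemoglobin concentrations, and hence the tissue oxygen saturation being monitored.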

Diffuse optics are also proving vital in breast cancer screening for patients with dense mammary tissue that is difficult to resolve in routine screening procedures. These techniques help separate out the higher-risk patients and prevent either unnecessary biopsies or missed diagnoses. As with blood oxygen monitoring, these diffuse techniques can also provide constant monitoring to assess the patient’s state and reaction to chemotherapy medication, providing new insight for the clinicians treating the tumour.

Summing up

Professor Tromberg’s concluding remarks were that the work his group has undertaken would not have been possible without the links to other fields which provide insight into application requirements. Having researchers who are trained in both photonics and healthcare has been vital to developing novel techniques to solve long-standing medical problems that wouldn’t otherwise be apparent to people specialising only in photonics. He re-iterated this in answer to Dr James’ question at the end of the lecture, regarding how to manage cross-discipline projects – “Ideally you need a dedicated SWAT team that can find applications for techniques you work with. A mixed team is vital in developing those technologies further”.

Buying things is hard

As you may have gathered from my not very oblique references and downright obvious twittering, I have been busy working on a new project over the last few weeks. The project is in its early stages and at this point I am mostly checking that my data is real before trying to persuade my boss to actually run with it as a project. However, one thing I do need to know is: if I wanted to take it further, how much would it cost in various supplies etc.? So yesterday, I took to the various suppliers of lab equipment to track down the long list of things I need to get it up and running.

Over the years, I have had significant experience of lab supply websites. My previous life at Mediwatch meant dealing with a lot of suppliers, and I can say that, with only a few exceptions, it is always a huge migraine-like headache. To give you some idea of the type of site I’m talking about, I’ve made a mock-up of a generic lab supplies website (see below).

Note: this doesn’t include the inevitable pop-up asking if I’d like to try their new version of the site – which appeared on 5 out of the 8 sites I flicked through while researching this article.

Annotated with cynical sarcasm

If I am honest, most sites are actually pretty nice to look at (no spinning GIFs or yellow text on a blue background) and their owners are clearly trying to make them as easy to use as possible. However, there are 5 common annoyances that few seem to have solved.

  1. Obligatory laboratory models – Who are these people dressing in lab coats to be photographed and why are they so interested in whatever coloured beaker they are holding?
  2. Search bars – Like a fool, the first thing I do on a website is try out the internal search system. Sadly, I have been spoilt by the likes of Amazon and eBay, where the search function will actually show me all the products on offer at that store. Lab suppliers have not caught on to this and frequently use search bars that are so broken they either show nothing or absolutely everything that might have some of the letters I used in the search…
  3. Categories – Having given up on the idea of just searching for what I want, I then turn to my last resort – categories. My best guess for these is that they are prepared by just one person; so what you end up with is their slightly eclectic groupings – many of which only make sense to them. My real-world example of this is when I found a retort stand under ‘Chemistry > Experimental Equipment’ but not under the broader ‘Lab Equipment’ available on the main page.
  4. Stock levels – A sales person from a large lab equipment supply company once told me that they always have a minimum of 3 of any item on their website. They also explained that 50% of the time they don’t actually have the item but order it in from their suppliers on an ‘as needed’ basis to save on having too much warehouse stock.
  5. Contact form – Why can’t I just e-mail companies?? Contact forms are so impersonal and you are never quite sure if anyone actually got them. [ED: we have a contact form...] But then again, they are very neat and tidy and clearly better.

Now having, by some miracle, found my item (or the nearest equivalent I’m prepared to settle for to avoid more searching…), this is where things have actually improved a lot. In the not too distant past, many sites would make you contact them to get a quote on a £5.00 part – apparently the price sheet was a closely guarded secret and would not be revealed unless personally authorised by the dark god of sales. Mercifully, they have seen sense on this and now almost all sites will give you the prices up front, and in some rare cases they are then even mad enough to ship the part!

I wonder if Amazon have a lab supplies section

ImageJ is amazing

A long time ago in a lab about 20 miles away, I was working for a company called Mediwatch to develop a new micro-array platform that was internally named Zero-flow. It was a nifty little device that was excellent at controlling the flow of a sample over a sensor system. The company I worked for at the time was small and not flush with cash, so for a while the only resource allocated to the design and development of this technology was me. With this budget of £0 I needed to produce an assay system that could demonstrate sensitivity significant enough to warrant further development. So I did what any over-enthusiastic scientist would do: I bodged together various bits of kit from home and lying around our very sparse lab to build a half-decent microarray fluorescence reader system. It was about 90% cardboard and it fell over if you walked past too quickly, but it did the job and produced some pretty good photos.

For anyone interested the dots were made using a Lego X-Y-Z plotter and the top row shows a reaction to 1µg IgG and the bottom row the reaction to 1ng IgG

However, as nice as these pretty pictures are, the one thing I didn’t have any access to was image analysis software, so I had no way of saying how much of a dot these dots were. This is where ImageJ came in.

ImageJ is a free software package that is to image analysis what real butter is to a piece of toast – vital and delicious. Authored by Wayne Rasband at the Research Services Branch of the National Institute of Mental Health in Bethesda, it is an amazing suite of very useful image analysis tools that are totally free!

In the image I showed earlier, if, for example, I wanted to know how bright those two dots are, ImageJ can simply show me the image intensity over an area: I select a transect across the dot, and ImageJ plots the profile.

Okay, so it’s good with images but the plot function could do with a little jazzing up!
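For anyone who wants to do the same thing outside ImageJ, a rough Python equivalent of that plot-profile step looks like this (the file name, row and column ranges are made-up placeholders):

```python
import matplotlib.pyplot as plt
from matplotlib import image

# Load the fluorescence photo as a greyscale array ("dots.png" is a
# placeholder; any image matplotlib can read will do).
img = image.imread("dots.png")
if img.ndim == 3:
    img = img[..., :3].mean(axis=2)  # collapse RGB to a single intensity channel

# Sample the intensity along a horizontal transect through one dot.
row = 120                     # pixel row crossing the dot (a guess)
profile = img[row, 200:400]   # columns spanning the dot

plt.plot(profile)
plt.xlabel("position along transect (pixels)")
plt.ylabel("intensity")
plt.title("Intensity profile across a fluorescent dot")
plt.show()
```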

So in no time at all, I can switch from messy photographs to raw data about the profile and intensity of my results. This kind of image analysis is vital for a whole range of the work I do, and ImageJ is a fantastic workhorse. I’ve even managed to use ImageJ for more mundane tasks, such as calibrating the monolayer troughs I’ve used in the past for coating experiments. During these experiments it is vital to know what area of monolayer is being held on the surface of the trough so that it can be dynamically controlled. However, over time the motors can drift and the original calibration can shift, so the area needs re-measuring. Previously, people had done this by flipping over the 5 kg trough and playing with the little pot motors until they got a sensible value – not very practical. A much easier way was to use ImageJ to calculate the area for me – I took a photograph at the open and closed positions of the trough (with a ruler on the trough for scale) and ImageJ then simply provided the area of the selected region.

I cannot understand why they ever made these troughs sort of awkwardly circular…
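The arithmetic behind that calibration is trivial once ImageJ has handed over the pixel measurements; here is a sketch with made-up numbers:

```python
# Scale calibration from a ruler in the photo (all numbers invented for
# illustration; in practice they come from measuring the ruler and
# tracing the trough opening in ImageJ).
ruler_length_mm = 100.0      # known length of the ruler in the photo
ruler_length_px = 412.0      # the same length measured in pixels
mm_per_px = ruler_length_mm / ruler_length_px

selected_area_px = 185_000   # pixel count inside the traced trough area
area_mm2 = selected_area_px * mm_per_px ** 2

print(f"Scale {mm_per_px:.3f} mm/px -> trough area ~ {area_mm2:.0f} mm^2")
```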

Beyond these simple analysis tools, ImageJ is capable of an insane number of other cool things. For example, when showing people the results from the fluorescence photographs from before, one simple way to make them more visually accessible was to convert the 2D photographs into smoothed 3D surface plots.

Single 1µg dot
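If you would rather script it than click through the menus, the same surface-plot trick is only a few lines of matplotlib (again, the file name and crop window are placeholders, and this is an approximation of ImageJ’s built-in surface plot rather than the same code):

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import image
from scipy.ndimage import gaussian_filter

# Load the photo, collapse to greyscale, crop around one dot and smooth it.
img = image.imread("dots.png")
if img.ndim == 3:
    img = img[..., :3].mean(axis=2)
z = gaussian_filter(img[80:180, 200:300].astype(float), sigma=3)

# Plot intensity as height - roughly what ImageJ's surface plot does.
y, x = np.mgrid[0:z.shape[0], 0:z.shape[1]]
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(x, y, z, cmap="viridis")
ax.set_zlabel("intensity")
plt.show()
```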

Or if I wanted to show a surface within a 3D space, I can simply convert it to a rotating 3D animation.

If you stare at it for 1000 rotations there is something wrong with you

One last thing that I’ve just discovered is that I can hook ImageJ into Python, so in the future I can have these animations auto-generated alongside all my data analysis!
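One route for that bridge (an assumption on my part rather than a description of my actual setup) is the pyimagej package, which lets Python start an ImageJ gateway and run macros; a minimal, untested sketch:

```python
import imagej  # the pyimagej bridge: pip install pyimagej

ij = imagej.init()  # start an ImageJ gateway from Python

# Run a small ImageJ macro from Python: open the dot image and print
# its mean intensity ("dots.png" is a placeholder file name).
macro = """
open("dots.png");
getStatistics(area, mean);
print("mean intensity: " + mean);
"""
ij.py.run_macro(macro)
```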

Summary: I quite like ImageJ – it makes pretty pictures. I just wish that other authors would be as keen to open up their software and make it available to the community.

Good (paper) lab book house keeping

A few weeks back, I asked the community at large for advice on where to go to set up an online open lab book. The response was fantastic and I have a whole list of places to look to for online support. However, before I jump into the deep end, and for the benefit of those who are either not scientists – or just quite lazy about lab book keeping – I thought I would take a moment to explain why lab books are important and how you should be using them.

Academic research cartoon

At its core, a lab book is just that – a book in which you record your lab activities. Historically, these are interesting but they are secondary to the discoveries their owners made, as lab books were used as much or as little as the owner felt like. For example, Leonardo da Vinci’s notebook is impeccably detailed and annotated, and full of beautiful drawings and doodles of inventions, whereas Isaac Newton’s is a wall of impenetrable text in handwriting that a drunken spider would be ashamed of. Interestingly, this wasn’t just a difference of Leonardo being artistically talented and using it in his lab notes; there is some evidence that Isaac Newton deliberately complicated his notes with riddles and confusing wording in order to hide the meaning from anyone he deemed not intellectually worthy of its content. In some ways, this over-complication is not far off how scientific publications work today, sadly. However, despite the difference in style, both Isaac and Leonardo used their lab books to record their findings and thoughts, and they would later use these notes to present their work as books and periodicals. The notes themselves were personal in nature and not designed for sharing or disseminating their work to others.

As science changed, so did the lab books. Companies running teams of scientists quickly realised that lab books were critical to sharing research between related teams and for retaining knowledge when a scientist leaves to work somewhere else. If all of scientist X’s lab notes are written in shorthand that only they can understand, then that is no use to the company that has just paid them for the research! Outside of the commercial world, this sharing of lab notes is vital to ensuring that you retain the fine, detailed knowledge of the experiments that is often lost when they are prepared for publication. Good lab book keeping can ensure that new scientists learn more quickly from those more experienced and are not doomed to repeat the same mistakes (and any experienced scientist should have plenty of mistakes/failed experiments to learn from!).

So to help any aspiring scientist write up a good lab book, I thought it would be useful to list a few top tips for paper-based lab books. I have not quite worked out how exactly some of these fit with using an electronic lab book, but that is an ongoing project and I’ll post a guide for electronic lab books just as soon as I work out the best way of doing it!

  1. Always write in pen, never ever pencil. This is for two reasons: firstly, pencil can be altered, which will ruin any chance of using your lab book as evidence that you were developing a technology first; secondly, you should just write in your lab book and not get too hung up on neatness or getting things wrong. If you make a calculation error that you need to correct, cross it out and put the right value in if required. Having the mistake visible will remind you next time not to make the same error and to double check figures that you have had a problem with in the past.
  2. Don’t remove/white out anything. If you didn’t get it from tip 1 then I feel it needs repeating – ‘don’t remove stuff’. Even errors, cock-ups and experiments that cause unintended fireballs need writing up – just because you got something wrong is no reason not to make a note of it.
  3. Write in chronological order. Okay, this is a bit of a no brainer but I have seen lab books that people have just randomly scrawled in and they quickly become incomprehensible to even the author.
  4. Date everything. You may not need to re-visit some of your work until months or even years later and you will have forgotten when you did a particular experiment. This is particularly important for matching up computer data to a particular experiment.
  5. Write out the aims and materials of an experiment before you start the experiment. This step is a pretty good habit, as thinking through your experiment aim beforehand is a very good way of double checking that you are actually running the right experiment. Writing the materials down in advance is similarly an important way of double checking you are all setup and ready to run. Personally, I also find that once I’ve run the experiment, I am much more interested in running the next one rather than writing up the thing that has just failed…
  6. Use your brain when including data. Ideally all results should go in your lab book, but if you are generating GBs of data then this is probably impractical. If you can’t put the data in your lab book then make sure that, at the very least, you paste in a summary graph or table.
  7. Maintain an index. This saves sooooo much time later when you are trying to find that experiment you did, you know the one with the beaker and that solution.

There are a few more things you should/shouldn’t do, but they vary a lot depending on your field and where you work. From my experience training up a few new scientists, the above tips cover the areas that people most commonly seem to skip or forget. In the interests of openness, I have included the scanned page of my lab book I posted for “show your lab books day” a while back. I have highlighted the areas where I’ve failed to heed my own advice.

You’ll be pleased to hear I gave myself detention for this

Busy

Sorry about this but there is no proper blog post sorted for this week.

My free time this week has been spent dabbing a small child with calamine lotion after he declared on Sunday morning that he “feels a bit itchy”. Normally, when my free time/personal life is a bit hectic, I try to find some room in my work schedule to put a post together; however, I decided to pop into the lab on Monday to do a quick experiment and have only emerged since to excitedly wave results at my colleagues.

So no blog post… but I don’t want anyone to feel short-changed, so instead please enjoy this picture of some poorly labelled data. I can’t explain why yet, but this is very, very exciting and the first image of something interesting.

Who needs labels