NetNotes
do wonders for catching changes in the system before they become major issues. Quite often I'll even include a quick validation of the control calibration data in the analysis code, as a computer tends to be much better at catching slight changes over time than the human eye (especially considering our built-in biases). As such, I can tell you from a recent project that the lateral chromatic aberration and axial point spread function did indeed drift slightly over time on a commercial core microscope during the couple of months we collected our data for a super-resolution imaging experiment. That said, I can also sympathize with having a complex, finely tuned experiment put through the wringer, especially if the experiment is producing exciting results. Assuring the users that almost any re-calibration should be a simple linear transform of the previous configuration should relieve some of their stress. Ben Smith
benjamin.smith@berkeley.edu
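Ben's point about automating the control check lends itself to a very small script. The sketch below (Python, with made-up metric names and tolerances) flags drift in stored calibration metrics and recovers the linear transform he mentions by regressing new control measurements against the old ones; none of it is tied to any particular instrument.

import numpy as np

# Hypothetical baseline metrics from a control slide; names and tolerances are
# illustrative assumptions, not values from any real system.
baseline = {"axial_fwhm_nm": 620.0, "lateral_ca_shift_nm": 35.0}
tolerance = {"axial_fwhm_nm": 0.05, "lateral_ca_shift_nm": 0.15}  # fractional

def check_calibration(todays):
    """Return the metrics that drifted beyond their fractional tolerance."""
    flagged = []
    for key, ref in baseline.items():
        rel_change = abs(todays[key] - ref) / ref
        if rel_change > tolerance[key]:
            flagged.append((key, ref, todays[key], rel_change))
    return flagged

# If a re-calibration really is a simple linear transform of the previous
# configuration, it can be recovered by fitting new control values to old ones:
old = np.array([100.0, 200.0, 300.0])   # e.g., bead positions under the old calibration
new = np.array([101.5, 203.0, 304.5])   # the same beads under the new calibration
gain, offset = np.polyfit(old, new, 1)  # new ≈ gain * old + offset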
Of course, PMs and service calls are extremely important for all the good stuff. We rely on them for operation and for learning more about the instruments. However, sometimes we have problems. Increased laser power is not a problem; it's always been a good thing for us, especially for people doing FRAP, but as already noted, we need to know what the laser powers are. The biggest problem we have after PMs or other service calls is that settings have been changed during the service. For instance, a lens position has been reset to a calibration lens and not changed back. Or the properties applied to the Reuse command have been changed. Or camera colors are swapped. Essentially, there isn't a checklist for resetting states. Software updates that change functionality, sometimes eliminating features or moving buttons, are another issue. Michael Cammer
michael.cammer@nyulangone.org
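One low-tech way to approximate the missing checklist is to snapshot the settings you care about before a service visit and diff the snapshot afterwards. A minimal sketch, using placeholder setting names rather than any vendor's actual keys:

import json

def save_snapshot(settings, path):
    """Dump a dictionary of instrument settings to JSON for later comparison."""
    with open(path, "w") as f:
        json.dump(settings, f, indent=2, sort_keys=True)

def diff_snapshots(before_path, after_path):
    """Return {setting: (before, after)} for every setting that changed."""
    with open(before_path) as f:
        before = json.load(f)
    with open(after_path) as f:
        after = json.load(f)
    return {k: (before.get(k), after.get(k))
            for k in set(before) | set(after)
            if before.get(k) != after.get(k)}

# Example workflow (setting names are hypothetical):
# save_snapshot({"camera_color_order": "RGB", "reuse_properties": "v2"}, "before.json")
# ...PM or service call...
# save_snapshot(current_settings, "after.json")
# print(diff_snapshots("before.json", "after.json"))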
For the settings, there are values in the database. We have set these values back to the uncalibrated settings for a particular user so that they could continue with their existing experiment. We then reset the values to the calibrated values for new users. Your service engineer should be able to show you where these values are in the database. Brian Armstrong
barmstrong@coh.org
We shoot a dim, fiber-coupled tungsten or tungsten-deuterium lamp into the objective at low gain and record the spectrum. The lamp is stabilized and gives a constant spectral power density, so you can use it to monitor the stability of a detector over time. Additionally, if the system is "recalibrated," you can compare the previous lamp spectrum to the post-recalibration spectrum to get a transform that lets you safely compare data acquired before and after. Our lab has used spectral imaging extensively for over a decade, and it seems like only now are other labs realizing the full advantages and disadvantages of the technique. I also feel that spectral detection was unleashed on the user base by the various vendors without any real consideration for long-term stability or calibration. I have found that most systems drift considerably over time and that users need to monitor this for anything quantitative. Craig Brideau
craig.brideau@gmail.com
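The transform Craig describes can be written in a few lines. The sketch below computes a per-channel correction factor from lamp spectra recorded before and after a recalibration and applies it to post-service spectral images; the channel counts and the 5% drift threshold are placeholder assumptions.

import numpy as np

# Lamp spectra (counts per spectral channel); real systems have 32+ channels.
lamp_pre  = np.array([120., 340., 560., 480., 210.])  # before recalibration
lamp_post = np.array([118., 352., 548., 500., 205.])  # same lamp, after service

# Because the lamp output is constant, the ratio is a per-channel gain factor.
correction = lamp_pre / lamp_post

def to_pre_service_scale(post_image):
    """Rescale a (channels, y, x) spectral image acquired after the service so
    its channel response matches the pre-service detector state."""
    return post_image * correction[:, None, None]

# Channels whose correction factor wanders far from 1.0 over time indicate a
# detector stability problem worth flagging for quantitative users.
drifted = np.flatnonzero(np.abs(correction - 1.0) > 0.05)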
It would be interesting to know if the 7-color samples give significantly different results when unmixed with the 2 different calibrations and reference spectra. Have they saved the original, not-yet-unmixed data? This raises 2 questions: 1) it points to the importance of defining 'original' data in microscopy (which we are supposed to store for 10 years after publication). Is it enough to save the unmixed data? Is it enough to save the data before unmixing without saving the instrument calibration data? 2) if the same sample, unmixed with the 2 calibration/reference spectra sets, shows different results, and that influences the interpretation so that the conclusion changes, one can argue that
the experimental design might have been flawed from the start; that is, the instrument "noise," in a wide sense, was not considered in the interpretation of the result. Re-acquiring the reference spectra seems wise. Sylvie Le Guyader
sylvie.le.guyader@ki.se
I would echo Sylvie's comments: do you see a difference in results between the same fluorophores before and after the service with the two correct reference spectra sets? My feeling is you shouldn't, and it should correct for the weightings. To answer your specific questions, I am well aware they need recalibrating now and again. I had problems with one of mine and had to get an engineer to recalibrate it. And yes, I inform users that they should recapture reference spectra. I think that, as part of the PM service, both the grating positions and the normalization of the detectors are calibrated to one another. I have seen differences between detectors in the array even after service, and in a way I prefer the poor man's version, using one detector and a sliding dichroic. So, I think if you see differences before and after service, it is more likely to be the normalization of the PMTs rather than the spectral positioning (if you use new reference spectra). Regarding the QC of the system for reproducibility, this is far trickier. I think that taking a well-defined and stable spectrum to show that the windows are where they should be is about all you can do; as several others have suggested, use an acetate slide or a stable excitation source. A word of warning if your users are trying to quantify intensities from such data: the algorithm they use, or at least the ones I read up on when the LSM META heads first came out (I think it is still the same), essentially splits the data in each pixel among the reference spectra with weightings. I struggle to see how this can ever be as quantifiable as simple intensity data from a PMT (yes, I know that is another argument, but this is adding another layer of variables on top!), and the weighting is always going to be different between samples if they have been unmixed with different numbers of spectra or with/without residuals, so this should always be kept consistent too. Glyn Nelson
glyn.nelson@ncl.ac.uk
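For readers unfamiliar with the per-pixel weighting Glyn mentions, linear unmixing solves pixel_spectrum ≈ R · weights, where the columns of R hold the reference spectra. The Python sketch below uses non-negative least squares purely as an illustration; commercial implementations differ in details such as residual handling, so this is not the vendor algorithm.

import numpy as np
from scipy.optimize import nnls

def unmix_image(spectral_image, references):
    """spectral_image: (channels, y, x); references: (channels, n_fluors).
    Returns (n_fluors, y, x) abundance maps via non-negative least squares."""
    ch, ny, nx = spectral_image.shape
    n_fluors = references.shape[1]
    out = np.zeros((n_fluors, ny, nx))
    pixels = spectral_image.reshape(ch, -1)       # flatten spatial dimensions
    for i in range(pixels.shape[1]):
        weights, _residual = nnls(references, pixels[:, i])
        out[:, i // nx, i % nx] = weights
    return out

Because every fitted weight depends on which columns are present in R, unmixing the same data with a different number of reference spectra changes all the weightings, which is exactly why Glyn advises keeping that choice consistent across samples.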
Thank you everyone for the input on the spectral detector calibration. We have two groups using the system: one for 7-color OPAL staining and one for Teal/Venus and GFP/mRFP FRET sensors. They will both redo the reference spectra in the new year for their new data. For the OPAL staining, they had noticed some crosstalk after the calibration (using the old reference spectra). For the FRET, the reference spectra were initially collected with the black level set incorrectly; each PMT has a different offset, and some were fine, but others were clipping the data. Then the spectra were collected correctly. Finally, they will have to collect new spectra with the recalibrated system. I think we will take some time and look at things carefully and perhaps write up a technical note. The users will store the data with the appropriate reference spectra so it can be unmixed in the future and the data remain reproducible. The Airyscan discussion is interesting too. Maybe a new QUAREP-LiMi working group! Claire Brown
claire.brown@mcgill.ca
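The black-level clipping Claire describes is straightforward to screen for in software. A hedged sketch that flags spectral channels where a suspicious fraction of pixels sit at the digitizer floor (the 1% threshold is an assumption, not a standard):

import numpy as np

def clipped_channels(spectral_stack, floor=0, max_fraction=0.01):
    """spectral_stack: (channels, y, x). Returns indices of channels where more
    than max_fraction of the pixels are pinned at the digitizer floor, which is
    the signature of an offset set too aggressively."""
    flat = spectral_stack.reshape(spectral_stack.shape[0], -1)
    frac_at_floor = (flat <= floor).mean(axis=1)
    return np.flatnonzero(frac_at_floor > max_fraction)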
Back-Illuminated sCMOS vs EMCCD Cameras
Confocal Listserver
We are thinking about replacing some ∼10-year-old iXon 897 cameras on a single-molecule, two-color, two-camera TIRF system. I've seen some comparisons between the earlier sCMOS cameras and EMCCDs. I would like to get the opinions of people on this list as to which camera would be the best for single-molecule imaging. The single GFPs we image by TIRF will be diffusing, not stationary. Many of the comparisons are for localization microscopy, where the target is stationary. We are particularly interested in examples of diffusing single fluorophores. Any experience or thoughts are appreciated. Thanks, Jeff Spector
jospector@gmail.com
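As background for this question, the usual way to compare the two sensor types is a simple photon-budget model: an EMCCD effectively eliminates read noise but pays an excess-noise factor of about √2 on the shot noise, while a back-illuminated sCMOS keeps the shot noise clean but adds a 1-2 e⁻ read-noise floor. The sketch below implements that textbook model; every camera number in it is an illustrative assumption, not a spec for any particular camera.

import numpy as np

def snr_emccd(photons, bg=2.0, qe=0.9, read_e=50.0, em_gain=300.0, excess=np.sqrt(2)):
    """Per-pixel SNR for an EMCCD: excess noise on shot noise, read noise divided by EM gain."""
    signal = qe * photons
    noise = np.sqrt(excess**2 * qe * (photons + bg) + (read_e / em_gain) ** 2)
    return signal / noise

def snr_scmos(photons, bg=2.0, qe=0.95, read_e=1.5):
    """Per-pixel SNR for a back-illuminated sCMOS: plain shot noise plus a read-noise floor."""
    signal = qe * photons
    noise = np.sqrt(qe * (photons + bg) + read_e ** 2)
    return signal / noise

# A fast-diffusing single GFP yields few photons per pixel per frame, which is
# exactly where the two noise penalties trade off:
for n in (10, 30, 100, 300):
    print(n, round(snr_emccd(n), 2), round(snr_scmos(n), 2))

In this simplified model the EMCCD tends to win at the very lowest photon counts, where read noise dominates, and the sCMOS tends to win once the excess-noise penalty outweighs its read-noise floor; where the crossover falls depends strongly on background and exposure time rather than on the datasheet alone.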