I am doing lifespan assays for ~35 genes simultaneously in C. elegans on FUDR plates (FUDR inhibits reproduction) so that I do not have to re-pick the worms. I tried it once and got positive results, but one problem is that the worms occasionally burrow into the NGM gel, sometimes because I accidentally damage the gel and sometimes for no apparent reason. Now I am preparing to repeat the experiment and am considering increasing the agar concentration to strengthen the gel and reduce the chance of worms burrowing into it.

However, senior members of the lab are not comfortable with the change, because 1.7% agar is the standard concentration and everyone uses it. My questions are: how was the 1.7% agar standard established? What is the rationale for that concentration? Are worms still comfortable at higher agar concentrations, such as 2.5%? Does such a change affect lifespan?

For my lifespan assay, I am not planning to compare my results with those of other labs. I am simply comparing two groups of genes (evolutionarily selected for long vs. short lifespan) to see how they differ in their effect on lifespan after RNAi knockdown. To me, the experiment seems well controlled, and whether the agar concentration matches the standard is almost irrelevant as long as the worms are not stressed. Given the large daily workload and the length of time the worms must be kept on the same plates, I would rather have a smooth process with few accidents than stick to the standard. My contract is also very short (only a few months of employment), so I would prefer not to have to test the change myself first.

So, if anyone has experience with a different agar concentration, knows how it affects lifespan assays, or can point to materials that explain the standard, that would be greatly appreciated.

Thanks!
