Wednesday, 18 December 2013

Do 40S and 60S come together sooner than we thought?

The ribosomal subunits 40S and 60S are produced and assembled whilst they are still within the nucleus. However, there seem to be several layers of control that prevent their joining into a fully functioning 80S ribosome, a block which is only relieved once they are exported into the cytoplasm. This has often led researchers to believe that 80S ribosomes can be found only within the cytoplasm. However, recent research has brought this theory into question.

The synthesis of proteins is required for the viability of all living cells, with the code for these proteins contained within chromosomes as a sequence of bases on DNA. For this sequence to be developed into functioning proteins, a number of complex processes must first take place, chief amongst which is the transcription of DNA into messenger RNA (mRNA). This mRNA can then be used to assemble simple amino acids into complex proteinaceous structures. However, this second process, translation, is not possible without the large molecular machine called the ribosome, which acts as the primary site of protein synthesis.

During the production of a ribosome, the subunits 40S and 60S must join together to form a fully mature and translating ribosome (80S), the formation of which acts as a main indicator that translation is occurring within a cell. Much previous research has suggested that the 60S and 40S subunits are synthesised in the nucleus through various complex mechanisms, with the two subunits being incapable of associating with mRNA, and therefore of functioning properly, until they are exported into the cytoplasm. Once in the cytoplasm they are able to form the fully functioning 80S ribosome, which is then capable of translating mature mRNA into protein.

There are several layers of control acting on each of these components. Studies in Saccharomyces cerevisiae indicate that this repression can be imposed by nonribosomal assembly factors (AFs), which bind to pre-40S and pre-60S subunits and prevent their activation and assembly into the 80S ribosome until they are exported into the cytoplasm. In addition, other proteins are believed to be crucial for the translocation of these ribosomal subunits through the nuclear pore, the control of which may also help prevent the assembly of 80S ribosomes before they are required. Several lines of evidence also suggest that the 40S subunit is not fully processed whilst it still resides within the nucleus, further preventing it from forming a mature ribosome unless exported into the cytoplasm.

However, with a recent study indicating that immature 40S subunits can actually initiate translation whilst still residing in the nucleus, and with several other studies supporting this finding, it is now clear that 40S is capable of interacting with 60S to form an 80S-like structure that supports a low level of inefficient translation.

With this information in mind, the goal of a recent paper by Al-Jubran et al (2013) was to determine the cellular location of the functional 80S ribosome within Drosophila. To do this, a number of ribosomal proteins (RPs) were identified and their positions determined, both within the free 40S and 60S subunits and within the mature 80S structure. RPs found to sit close to each other in the 80S ribosome were then tagged in such a way that they fluoresce when in close proximity to each other. This gives a clear indication of exactly where 80S ribosomes are situated within a cell.

The results from this visualisation indicated that the majority of ribosomes were, as expected, located primarily in the cytoplasm of the cell. However, there were also a number of signals suggesting that properly functioning ribosomes can be found within the nucleus, with higher intensities at the nuclear periphery and the nucleolus.



However, these results required further validation. One confirmation came from treatment with puromycin. When a cell is treated with puromycin, its 80S ribosomes become inactive and non-translating. When these non-translating ribosomes were tagged in the same way, no signal was produced. This indicates that the 80S ribosomes visualised without puromycin treatment must have been actively translating, giving further weight to the idea that translating ribosomes can be found within the nucleus.

Therefore, this research supports previous findings indicating the presence of 80S ribosomes within the nucleus, as well as establishing a robust technique for visualising these ribosomes in a simple, clear-cut way. This research could be developed further by quantifying the level of translation carried out by these nuclear 80S ribosomes. That would make it much clearer whether the translation of mRNA before cytoplasmic export is essential for the proper functioning of the cell, or whether this process simply occurs at this stage to kick-start translation of all genes.

Sunday, 10 November 2013

The role of centromeres in the bouquet formation of Tetrahymena thermophila.

Bouquet formation, homologous pairing and crossing over in early meiosis are all processes that are strongly dependent on the centromere.  

When meiosis is initiated, one of the first structures seen to form in almost all organisms is the chromosome bouquet. This is an arrangement in which telomeres bunch together in a confined area of the nuclear periphery with centromeres at a polar position to them, which, as its name suggests, resembles a bouquet of flowers. This structure allows for the pairing of homologous chromosomes within the cell.



Tetrahymena thermophila is a unicellular ciliated protist whose micronucleus elongates and stretches dramatically during meiotic prophase, which is the point at which Tetrahymena's exaggerated bouquet forms. Loidl, Lukaszewicz, Howard-Till and Koestler at the University of Vienna have released a paper that may help to explain the mechanisms taking place during this process in the unicellular protist.

This paper investigates the importance of double-strand breaks (DSBs) and centromere function in Tetrahymena's bouquet formation, suggesting that centromeres have essential functions in recombination and chromosome pairing.
To begin their investigation, Loidl et al attempted to understand the function of centromeres during Tetrahymena's nuclear elongation. They did this by constructing strains in which the centromeric histone H3 variant Cna1p was disabled through RNAi depletion. Under wild-type conditions, where telomeres and centromeres segregate during the formation of the bouquet, centromeres cluster at the periphery of the nucleus. However, in this RNAi mutant, immunostaining detected only background signal, with no clear organisation of centromeres.

This paper also discusses how the bouquet arrangement of centromeres and telomeres at opposite poles of the nucleus is highly dependent on the interaction of microtubules with the kinetochore. It was found that microtubule interaction is the main contributor to nuclear elongation, and that centromeres play no role in elongation itself. Microtubules have two known functions in Tetrahymena: to elongate the nucleus and to hold the centromeres at a fixed position within it.

It was already known that DSBs are needed for the bouquet to form, as bouquet formation is an ATR-dependent response; ATR is a kinase activated by the persistent presence of single-stranded DNA, a common intermediate in most DNA damage repair pathways. Loidl et al set out to test whether the bouquet was actually necessary for DSB repair by adding nocodazole, a microtubule inhibitor, to prevent the formation of the bouquet within Tetrahymena nuclei.
As an additional measure, DSB formation and repair were monitored using pulsed-field electrophoresis in both the control and the nocodazole-treated cells. Both of these experiments showed that DSBs were repaired independently of bouquet formation. Therefore, whilst DSBs are needed for the initial formation of the bouquet, the bouquet itself is not needed for the repair of those DSBs.

To further understand how DSBs regulate bouquet formation, Loidl et al created a scenario in which DSBs were continuously produced. To do this they treated meiotic-phase cells with cisplatin, an inducer of DSBs.
Cells treated with cisplatin were no longer able to exit the bouquet stage, suggesting that the trigger for the cell's release from the bouquet stage must be an intermediate step in DNA repair.

The bouquet structure is highly conserved amongst a vast number of different species, but its function varies slightly from organism to organism. For example, in Arabidopsis, as in Tetrahymena, telomeres are linked with the nuclear periphery. However, in contrast to Tetrahymena, the centromeres do not cluster at a single point at the opposite pole to the telomeres, but are dispersed randomly across the nucleus, with no evidence to suggest that they are involved in homologous chromosome pairing or recombination. Instead, chromosome pairing occurs in zygotene, when a structure loosely similar to the bouquet is formed. In mammals and budding yeast, chromosome pairing has been shown to be led by telomeres and dependent on the protein SUN1, which anchors the telomere to the nuclear membrane; Loidl et al could not find a similar protein in Tetrahymena.

The next step in the investigation of Tetrahymena's exaggerated bouquet is a more in-depth look at the function of telomeres and telomere-associated proteins. This will either confirm the views that this paper has put forward or give us better insight into the function of telomeres, allowing us to appreciate their involvement in the bouquet-forming process.

By understanding Tetrahymena's bouquet fully, we will be able to apply this knowledge to other organisms to help determine the processes that govern their bouquet formation. This would be especially important if we could apply this new knowledge to ourselves and the processes that go on in our cells during meiosis. That would give us a much clearer insight into how diseases and disorders may form, and therefore clues on how to prevent them.

Pretty soon our food is going to run out – but what can we do about it?

The human population of the earth has been increasing exponentially over the past few hundred years, a trend which is showing no sign of stopping. But with all these extra mouths to feed, do we have the resources to actually keep feeding them?

One of the biggest problems the human race faces at the moment is the growth of global poverty and global hunger, issues which will only worsen if the population increases to nine billion, as predicted. Whilst these are two separate problems, they can be resolved simultaneously, and in the long term, through the promotion of agricultural growth in areas where poverty is particularly rife. However, communities experiencing particularly harsh levels of poverty are usually found in areas whose land is incapable of sustaining a large number of crops. These areas therefore require new crop varieties which can grow with a much lower level of nourishment whilst still producing the same yield of food.

Therefore, research geared towards producing these new varieties of plants is of critical importance. The main method which could be used to produce them is increasing a plant's level of genetic recombination.

Recombination is a molecular process which occurs within the cells of plants during meiosis, a specialised round of cellular division which produces gametes, cells with half the usual number of chromosomes (haploids). During this time, genetically very similar chromosomes called homologues pair together and form cytological structures called chiasmata. When these chromosomes then resolve during a stage called anaphase, the resultant chromosomes often contain pieces of genetic information from each of the chromosomes that originally paired (recombinants). This introduces genetic variation within a population, which is often the driving force behind evolution and adaptation to different environmental influences.
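As a toy illustration of the exchange described above (a deliberately simplified sketch, not from any paper), a single crossover can be modelled in a few lines of Python: two homologues are represented as strings of alleles, a random breakpoint is chosen, and the arms beyond it are swapped to give two recombinant products.

```python
import random

def crossover(maternal, paternal, seed=None):
    """Toy model of a single meiotic crossover: pick one breakpoint
    and swap the chromosome arms beyond it, yielding two recombinants."""
    assert len(maternal) == len(paternal)
    rng = random.Random(seed)
    point = rng.randrange(1, len(maternal))  # breakpoint between loci
    recombinant_1 = maternal[:point] + paternal[point:]
    recombinant_2 = paternal[:point] + maternal[point:]
    return recombinant_1, recombinant_2

# Two hypothetical homologues carrying different alleles at five loci;
# each product mixes maternal (upper case) and paternal (lower case) alleles.
r1, r2 = crossover("ABCDE", "abcde", seed=7)
print(r1, r2)
```

Real meiotic recombination is of course far more constrained, which is exactly the obstacle the research above is trying to overcome.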

This process occurs in all sexually reproducing organisms. However, in certain plant species recombination is kept under incredibly strict control in an attempt to ensure the stability of the genome. Whilst this is a positive outcome for these plants in their natural environment, it presents a difficult obstacle when attempting to produce variants with more resilient phenotypes. It is therefore the aim of many researchers to further understand the mechanisms which govern recombination, as well as any techniques which could be adapted to artificially induce much higher levels of recombination.

It is this type of research which I am currently involved in whilst completing my master's at The University of Birmingham, UK. During my time working in this lab I will be attempting to determine whether okadaic acid, a phosphatase 2A inhibitor, is capable of inducing a much higher level of recombination in Brassica napus between chromosomes which normally don't recombine at all.



This type of research differs greatly from previous work on genetically modified (GM) crops, which usually involves placing into an organism foreign genes which would never have been present in the wild. That technique is often viewed unfavourably by the public at large, with many suggesting that the repercussions of manipulating nature in this way could never be fully understood.

However, the research that I am involved in is interested simply in inducing the expression of genes that are already present within the genome, but were never allowed to surface and influence the phenotype. This is a much more environmentally safe method of producing high-yielding plants, and one which could be a globally accepted resolution to current issues around poverty and food security.

Could this type of research be the answer to some of the big questions that we are going to have to face in the near future? Whilst it certainly has potential, we are still a long way away from being able to completely know the truth. But don’t worry, I’ll keep you up to date if I ever find out the answer to my tiny scope of research, and let’s just hope that the hundreds of other labs around the world do the hard work for us.

What do you think about this research? Comment below with any thoughts or questions. 

Thursday, 7 November 2013

Volunteers required for whole genome sequencing – Will you participate?

Today the UK has launched a personal genome project, urging 100,000 people to contribute their genetic information and have their genome sequenced and put on public record. Is this a big step in the advancement of sequencing huge sample sizes? Or is making our entire genome available to anyone online a step too far?

Our DNA is what makes us who we are; a sequence of bases in each and every one of our cells which contains the code for what we see in the mirror every morning. It is possible for someone to look at our DNA and determine almost everything about us; the colour of our hair, our eyes, or any genetic diseases we may have or be prone to, without ever having actually laid eyes on us.



Whilst it would previously have been near impossible for many people to gain access to our DNA, a new initiative has been set in place by The Personal Genome Project UK (PGP-UK), urging 100,000 volunteers to donate their genetic information for their genome to be sequenced. Similar projects have already been undertaken in other countries, including America in 2005. However, this is the first time a project like this has been attempted in the UK, and many sceptics have their doubts.

The aim of those conducting this project is to accelerate researchers' understanding of genes, both normal and defective, as well as how different environmental influences can affect those genes. However, the majority of companies sponsoring this research (one of which is Google) are hoping to use this data for commercial exploitation, specifically targeting advertising for drugs that each of us may need, based on our genetic code.

This is one reason why the project is controversial, but many also have issues with the broad spectrum of individuals who will be able to view your genetic code. Whilst names and addresses will not be included on record, the research group involved has warned that the anonymity of participants is not guaranteed, and they could potentially be identified.

Because of this, the application includes a number of tests, on which only a score of 100% will be accepted, designed to ensure that all those involved fully understand the potential risks.

If you decide to join the project, and are then accepted, you will receive a kit to take cheek swabs and will also be asked to attend a clinic to provide more extensive samples, with your genetic information published within a month.

It is hoped that this sort of extensive record of so many people's genomes will allow a number of major diseases with a large effect on public health to be linked to previously unidentified genes. This would be crucial information for those researching therapies for these diseases, and could dramatically advance our understanding of them.

Do you think you’d be interested in taking part in this project? Or do you think that having this kind of information about yourself on public record is a step too far? I for one know I’m definitely going to be signing up. 

Saturday, 26 October 2013

Money doesn't grow on trees - or does it?

A recent paper published in Nature Communications has given evidence that Eucalyptus trees may be capable of absorbing gold from deposits deep within the earth. At a time when new gold discoveries have fallen dramatically, could this be an answer to finding more?

New discoveries of gold have fallen by 45% in the last 10 years. A large amount of gold may be found deeper within the earth beneath sediments, but locating these deposits has proven incredibly difficult. One new technique that is currently in development is called biogeochemistry, which is essentially the use of biological systems (for example plants) to determine which minerals are present in the soil they grow in, through observing the concentrations of minerals which can be found in the plants themselves.

However, there are a number of problems with using this type of data, chief amongst which is the fact that gold concentrations within plants are usually incredibly low, with no unequivocal evidence that the concentration of gold in plants has any correlation with the amount of gold in the soil in which they grow. It was therefore the aim of this research to provide more solid evidence for this theory.

To do this, researchers observed the activity of Eucalyptus trees which were known to be growing above a gold deposit, buried beneath a thick layer of other sedimentary minerals.



Whilst it has previously been shown that gold particles are present in the soil around Eucalyptus trees, this research, with the use of the Australian Synchrotron (a machine which uses X-rays to view matter in vivid detail), was able to provide evidence for the presence of tiny amounts of gold in the leaves, twigs and bark too, showing that the trees were actually absorbing this material through roots buried deep beneath the earth.

This could be an important discovery in the mining of precious minerals, and even those that aren’t so precious. Normally, to find a deposit of ore, extensive exploratory mining would have to take place which would usually result in a dead end. This is both expensive and invasive to the environment that sits on top of the ore. However, if this new method of detection could be developed more extensively, all that would be required to locate what we were looking for would be a sample of leaves or twigs from the vegetation in the area. These samples would then be able to tell us exactly what was within the soil, and whether more extensive mining should take place.

What do you think about this new discovery? Can you imagine being able to pop into your back garden one day and being able to tell exactly what was in the soil by looking at just one leaf?


Comment below with your thoughts and questions and don’t forget to +1 and reshare if you enjoyed this article. 

Thursday, 24 October 2013

Is a tumour's micro-environment a source of innate resistance to anticancer drugs?

The RAF-MEK-ERK pathway is often mutated in a number of cancers, causing signals for cell proliferation and survival to be relentlessly activated. Various inhibitors of components in this pathway have been developed, but with limited efficacy. It is believed that the micro-environment of a tumour can confer a level of resistance to some cancer therapies through the secretion of HGF, a growth factor produced by stromal cells.

Each new development of a treatment against cancer is met with difficulties. This study by Straussman et al focuses on the RAF-MEK-ERK pathway, through which extracellular signals are transduced into intracellular signals via interaction with extracellular receptors. These signals cause the expression of transcription factors which regulate the expression of genes required for cell survival and proliferation, key genes when considering the formation of a cancer.

Previous research has targeted the RAS protein, a component of the RAF-MEK-ERK pathway, with unsuccessful results. This has led to research directed at the kinases downstream of RAS. It is one of these downstream kinases, RAF, and its potential inhibitors that Straussman et al investigate.

RAF inhibitors (RAFis) work by interfering with the RAF protein in the RAF-MEK-ERK pathway, preventing this pathway from transducing the signals for increased proliferation. It has previously been seen that inhibiting the mutated RAF reduces cancerous growth. However, these types of responses are almost always followed by a re-emergence of that tumour, brought on through the formation of resistance. Here it is suggested that the tumour microenvironment may be conferring that resistance through the secretion of soluble factors.

Whilst the role of the microenvironment in growth and metastasis is well documented, only recent research has suggested its function in drug resistance. To test the microenvironment's role in tumour drug resistance, Straussman et al began by developing a co-culture system in which GFP-labelled tumour cells were cultured alongside stromal cells to assess modulation of drug sensitivity. This was quantified by measuring how levels of GFP changed over a set period of time. The result was the observation that, when this co-culture system was exposed to RAFis, those RAFis were frequently rendered ineffective in the presence of stromal cells.
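The paper does not spell out its exact calculation, but a stromal "rescue" read-out of this kind might be computed along these lines; the function name and the GFP readings below are hypothetical, purely to illustrate the idea of comparing fluorescence with drug alone against fluorescence with drug plus stroma.

```python
def rescue_score(gfp_drug_alone, gfp_drug_plus_stroma, gfp_untreated):
    """Fraction of drug-induced growth inhibition reversed by stromal
    co-culture, from end-point GFP fluorescence (arbitrary units).
    0 = no rescue; 1 = stroma fully restores untreated growth."""
    inhibition = gfp_untreated - gfp_drug_alone
    if inhibition <= 0:
        return 0.0  # drug had no effect, so there is nothing to rescue
    recovered = gfp_drug_plus_stroma - gfp_drug_alone
    return max(0.0, min(1.0, recovered / inhibition))

# Hypothetical readings: the drug suppresses GFP signal on its own,
# but stromal co-culture restores most of the lost growth.
print(rescue_score(gfp_drug_alone=200, gfp_drug_plus_stroma=800,
                   gfp_untreated=1000))  # 0.75
```

A score near 1 across many stromal lines would correspond to the frequent rescue the authors observed.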

 
Straussman et al then investigated the effect of one RAFi in particular (PLX4720). To do this they tested the ability of stromal cell lines to provide seven mutant BRAF (V600E) melanoma cell lines with resistance to the anticancer drug. This resulted in six of the seven developing resistance to PLX4720.



It was therefore concluded that stromal cells can render certain anticancer drugs ineffective. Straussman et al confirmed that soluble factors secreted from stromal cells were responsible for the formation of resistant tumour cells. This confirmation was important, as it allowed them to identify the exact resistance-causing factor. To do this they conducted an antibody-array-based analysis of a large number of secreted factors, comparing the conditioned medium obtained from the six stromal cell lines that conferred resistance to PLX4720 with that from stromal cell lines that had not exhibited any sign of rescue activity.

From this, HGF, a growth factor that plays a role in activating the receptor tyrosine kinase MET, was identified as the source of rescue. HGF is capable of restarting this pathway through activating MEK, bypassing the RAF component of this pathway.  Straussman et al then confirmed the presence of HGF in a number of patients being treated with a RAFi, as well as confirming the phosphorylation, and therefore activation of MET (See Figure).
In these studies it is also predicted that the presence of stromal HGF in patients is a form of innate resistance, with patients capable of producing HGF showing a much poorer response to treatment than those unable to produce it.

However, further evidence was required to fully confirm that the presence of HGF was the cause of resistance. To collect this evidence, Straussman et al tested whether recombinant HGF was capable of inducing resistance in tumour cells, whilst simultaneously testing whether HGF-neutralising antibodies blocked resistance to PLX4720. This confirmed that HGF was indeed capable of producing the resistant phenotype.

Could these results have a clinical impact on the treatment of cancer?

These results are important clinically in defining why certain cancer treatments aren’t always effective, as well as identifying where research should be taken to combat this resistance. This paper can be compared to those investigating sorafenib, a molecular inhibitor of a number of protein kinases which has been approved in the treatment of kidney and liver cancer.

Future developments in this field should focus on whether the formation of resistance can be blocked through inhibition of HGF, as well as identifying the time scale over which stromal cells confer resistance to this treatment. If this time scale can be determined, a treatment could be developed which combines therapies at specific times to amplify their effectiveness. It may also be important to investigate whether other RAFis are rendered ineffective by HGF, which could make generating second-generation inhibitors important. However, it may be more prudent to investigate whether the activation of MEK, ERK or MET can be inhibited. This would have the same effect, but because inhibition would take place further along the pathway, there is less likelihood that resistance will form.

As mentioned by the authors, further research should also take place into investigating whether this type of resistance has a role against other anti-cancer drugs, as this may give us crucial information in combating against them.

What do you think about this research? Comment below with your thoughts and questions and don't forget to +1 if you enjoyed this article. 

Friday, 20 September 2013

Tom and Jerry: A friendship fueled by parasites?

Parasites have been shown to permanently remove the innate fear that mice have towards cats, which could give us key information in the treatment against schizophrenia. 

For as long as there have been mice, there have been cats to chase those mice around. Neither of them can help it; it's in the cat's nature to chase, and the mouse's to be chased. It's built into the mouse's brain at a cellular level to fear cats and run from any sign of them. This is a very important reaction for a mouse to have, as any other reaction would probably result in its untimely demise.

However, recent research has suggested that the parasite Toxoplasma, a very common infection in humans, may be able to alter the brains of mice and eradicate this innate fear. Not only this, but it has also been shown that the effect of Toxoplasma is retained in the mouse long after the infection has cleared, suggesting that this change in the mouse's behaviour is permanent and reflects structural changes within the brain.

An infection by Toxoplasma induces a disease called toxoplasmosis. Whilst most mammals and birds can be infected by this parasite, the fear-eradicating side effect seen in mice is totally unique.

Now, the biggest question is obvious: why? Why would a parasite want to make its host less scared of cats? This actually happens so that the parasite can complete its life cycle. The only place Toxoplasma can sexually reproduce is inside the cat's intestines, and the only way it can get there is for the mouse it's living in to be eaten.

This could put the cartoon Tom and Jerry into a brand new perspective. Whilst we all thought we were watching a cat and a mouse locked in a bitter rivalry, what we were actually witnessing was just one stage in the life cycle of a parasite that has evolved over millions of years to fill that particular niche. Who'd have thought it, eh?



It's all well and good having this very interesting piece of information, but in what ways, if any, can it relate to us as humans and help us understand ourselves?

Well, it turns out that this parasite is also prevalent in a large number of patients with schizophrenia, one symptom of which is an increase in levels of the neurotransmitter dopamine. It is thought that the parasite can induce this increase by forming microscopic cysts inside a number of brain cells, which increases their production of dopamine. In fact, some treatments for schizophrenia also act against this infection.

However, because this loss of fear in mice is persistent and retained long after the infection has cleared, the changes induced by Toxoplasma must occur before cysts are formed, and must alter the brain at a very basic level. This calls into question the theory that cyst-driven dopamine release is the cause of this behavioural change, potentially nullifying treatments against schizophrenia that target cysts.

As with most research, this has raised more questions than it has answered. However, it has given us more crucial information on schizophrenia, and allows us to take one more step toward the effective treatment of this disease.

What do you think about this piece of research? Write below with your comments and questions and don't forget to +1 and share if you enjoyed this article. 

If you're particularly interested in this research then you can find the original paper here.

Wednesday, 18 September 2013

Mystery solved as to why flies are so hard to catch.

Research produced by both Trinity College Dublin and the University of Edinburgh has shown that the way animals perceive the passage of time around them is linked to how active that animal is within its environment.

Past research has shown that the characteristics of an organism are limited by the size of its body and its metabolic rate. However, this more recent piece of research has shown that an organism's ability to perceive its environment is equally important in limiting how well it can exploit that environment.

For example, a species which is capable of quickly identifying a threat will be able to survive for much longer than one which is much slower at processing that sensory information. 

To explain this, researchers conducted experiments which found that the rate at which time is perceived varies dramatically between animals. Those which are much smaller and have much faster metabolic rates (e.g. birds or flies) perceive much more information per unit of time, and therefore see time passing much more slowly than larger animals with slower metabolic rates (e.g. the turtle).

This information was gained through the use of a phenomenon called the critical flicker fusion frequency. This determines how many flashes of light an organism can perceive per second before the light source is perceived as constant. This is actually the principle behind television screens, which produce what looks like a constant image through a series of flashes too quick for us to perceive individually. 
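To make the idea concrete, here is a minimal sketch of how flicker fusion works. The CFF values below are rough, assumed figures for illustration only, not the measurements from the study itself:

```python
# Illustrative sketch of critical flicker fusion frequency (CFF).
# The CFF values below are assumed round numbers for illustration;
# real measured rates vary by species and by study.

APPROX_CFF_HZ = {
    "human": 60,   # assumed approximate value
    "fly": 250,    # assumed approximate value
    "turtle": 15,  # assumed approximate value
}

def looks_constant(flicker_hz, species):
    """A light flickering faster than an animal's CFF appears steady to it."""
    return flicker_hz > APPROX_CFF_HZ[species]

# A ~100 Hz light source looks steady to us (and even more so to a turtle),
# but a fly would still see the individual flashes.
print(looks_constant(100, "human"))   # True
print(looks_constant(100, "fly"))     # False
print(looks_constant(100, "turtle"))  # True
```

The same comparison is what lets researchers rank species by how "fast" their visual perception is.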

This phenomenon is also used to explain how animals have varying perceptions of time, and shows that animals we would expect to be agile and fast-moving see time pass at a much quicker rate. This also explains why flies are so hard to catch: they see your hand moving towards them in slow motion, making it easy for them to move out of the way. It could also shed some light on how Neo managed to dodge all those bullets in The Matrix. 

On the flip side, there are also some species of tiger beetle whose bodies move faster than their eyes can process, forcing them to stop every now and then to take in the new surroundings they've charged into. 

This information isn't only valuable for what it tells us about the complexities of the animals around us, but could also have some implications for human biology in the future. 

At the moment, the limits of human sensory perception are being pushed by people like Lewis Hamilton. When driving an F1 car, Lewis is moving at pretty much the limit of his biological abilities. If he were to move any faster, his eyes wouldn't be able to take in his environment in time for him to react, which probably wouldn't result in a very pretty scenario. 

The only way for us to push ourselves past these limits would be either through drugs, which don't exist yet, or the adaptation of our eyes at a cellular level. This might seem pretty unlikely, but who knows, one day we could all be dodging bullets and trying to prevent the destruction of Zion. 



What do you think about this research? Comment below with your thoughts and questions and don't forget to +1 this article if you enjoyed it. 

If you're particularly interested in this research you can find the original research article here.

Sunday, 8 September 2013

Are you taking the piss?

Seven years, 20 researchers and a whole lot of pee have allowed a much more accurate chemical composition of urine to be determined. This has a number of knock-on effects throughout scientific research, especially in the identification and treatment of medical disorders. 


Urine is a beautiful thing: available in every shade of yellow, sterile, chemically complex and one of the most readily available biofluids in the world. However, because of urine's vast complexity, it has been difficult to fully understand each of its components, and what those components can tell us about the person who supplied it. 


Urine usually contains metabolic breakdown products from the food we eat and drink, contaminants we absorb from the environment, as well as the by-products of certain bacteria. The issue is how little information we have about each of these components, which leaves a gap in our scientific knowledge of this substance. 

To try and remedy this gap in our knowledge, researchers at the University of Alberta conducted an extensive and intense period of research to provide a quantitative characterization of urine. Doing this involved employing NMR spectroscopy, gas chromatography, mass spectrometry and high-performance liquid chromatography. 

Through the use of these technologies, over 3,000 previously unknown urine metabolites were identified. It is important for us to understand these metabolites, as they are often the products of a large number of different processes in the body, and can therefore give us an incredibly clear understanding of an organism's phenotype, from a source which is incredibly easy to obtain (pee). 

Now it might sound like that's an awful lot of effort to go through just to further understand our pee. However, this research could have massive implications for healthcare in the future. A person's urine can tell us a huge amount not only about their health, but also their diet, what they drink, the drugs they are taking and the pollutants they may have been exposed to. Analysis of a patient's urine could therefore allow a physician to quickly identify things like disease metabolites, enabling rapid treatment and potentially saving numerous lives.

What do you think about this research? Will it make you look at your pee in a different way from now on? Comment below with your thoughts. 

If you want to know any more about the experiments the team undertook then you can find the original research paper here.


Wednesday, 10 July 2013

Did Humans and Neanderthals speak to each other?

What’s the first thing that comes to mind when you hear the word Neanderthal? If you’re like most people you probably muster up an image of lumbering, grunting brutes who are just less developed versions of ourselves. You wouldn't be blamed for thinking that either; it’s the image that has been engrained into our psyche since their discovery almost 200 years ago. However, in the last 10 years our perception of Neanderthals has changed considerably, and we now know them to be much smarter than we ever gave them credit for.

The main reason for this change in our perception is the availability of ancient DNA, which is capable of giving us a much clearer and more concise image of how the Neanderthals lived, and what they were capable of.
The latest piece of information to be gained in this way suggests that Neanderthals may have been capable of producing much more complex speech than we previously believed, whereas before we would have assumed that the most they were capable of were primitive grunts.

This has put a spanner in the works of most language experts who have always stuck to the theory that our ability to speak came about incredibly quickly, and due to very few mutations in our genome. If this were the case, it would mean that speech began development only around 50,000 years ago.  However, this new evidence means that the development of language must have happened over a much longer period of time, and originated more than a million years ago through a combination of genetic and cultural influences.

This could give us some big clues about the interaction between Neanderthals and our ancestors, who coexisted for a long period of time. Could humans and Neanderthals have communicated with each other? Many assume that we outcompeted the Neanderthals, and that’s why they died out. However, there have been suggestions over the years that Neanderthals and humans interbred with each other, a theory supported by this new revelation.  

Can you imagine if Neanderthals were around today, with us being able to chat with them whenever we liked? Texting and emailing and calling whenever we pleased.

It would be a weird world, but one that I’d definitely enjoy living in.


What do you think about this story? Write below with your comments or questions and sign up for the mailing list at the top right of this page for notifications of whenever I leave a post. And don’t forget, if you enjoyed this post please +1 and reshare! 

Sunday, 7 July 2013

Spiders use electrostatic attraction to suck in their prey.

It has been shown that a spider's web is attracted to electrically charged objects, a fact which could help in the capture of its prey.

I’m not going to lie, I hate spiders. I feel like they’re always planning something, sneakily running around, making webs in corners and acting suspicious. And let’s be honest, who really needs that many legs?
Well, it turns out my suspicions may have been well founded, as new evidence suggests that spiders are smarter than we give them credit for.

It has been shown that spider webs are attracted to objects that are electrically charged, causing the threads of the web to distort towards each other. This is important when you consider that some insects such as bees are capable of generating an electric charge when they flap their wings – causing the whole bee to be seen as electrically charged.  

That means a positively charged insect would only have to fly near a spider's web to be sucked in towards it as the threads deform around it. This deformation dramatically increases the likelihood that the insect will be trapped and that the spider will get its next meal.

This is a really interesting piece of research, and could give us some insight into how we could adapt other materials to be of benefit to us. What isn’t known however is whether all insects are capable of producing a charge when flying, which could be the next stage in this investigation.

This is usually the point where I put up a picture of a spider. However, I’m going to skip that step for this post, mainly because if I have to upload a picture that means I’ll have to look at a spider, something I try pretty hard to avoid, especially after knowing how sneaky they are.


What do you think about this discovery? Write below with your comments or questions and sign up to the mailing list on the right of this page to get notifications of whenever I leave a post. 

Saturday, 6 July 2013

Moths send sonic blasts from their genitals.

Hawkmoths have been shown to be capable of producing ultrasound as a defence against bats. This on its own isn’t unique; what is unique is the source of that ultrasound: their genitals.

Have you ever casually glanced down at your genitals and wished that they had more functions? Well if you have this story might make you pretty envious. That’s because recent evidence has suggested that Hawkmoths (found mainly in the tropics) are capable of using their genitals to produce a loud beam of ultrasound.

Whilst I wish there wasn’t any purpose to this, there unfortunately is.

For millions of years, bats and moths have been locked in an epic battle against one another. Each has been adapting and evolving in an attempt to outdo the other, and this genital sonic blast seems to be just the latest in a long list of adaptations.

Whilst the purpose of this behaviour hasn’t been fully confirmed, it was observed that the moths produced this ultrasonic sound whenever bats approached. It can therefore be assumed that the hawkmoth’s ultrasound either gives a warning to the bats to stay away, or is capable of jamming the sonar that the bats use to visualise their environment, thus preventing the bats from ‘seeing’ them.

This is a great little discovery, and just goes to show how far evolution can push a species, and the methods that they use to defend themselves.

If you’d like to know more about how prey and predator evolve alongside each other, then read about the Red Queen Hypothesis here.

Also, just in case any of you have ever played Pokemon: yes, one of Venomoth's moves was Supersonic. Maybe we should look to Pokemon for more future discoveries about animals?


What do you think about this discovery? Write below with your comments or questions and sign up for the mailing list at the top right of this page for notifications of whenever I leave a post. 

Friday, 5 July 2013

New method of bacterial communication discovered.

Antibiotic resistance is one of the major health risks of the modern age, and could set our health care system back by decades. Much research has been geared towards preventing this problem, including the use of silver, which I described in a recent post that you can find here.

New research may have given us the foothold we need in the fight against antibiotic-resistant bacteria. It suggests that resistant bacteria within a population are capable of communicating with less resistant bacteria through chemical signals made up of small amino acid molecules. These signals can in turn make those bacteria resistant, spreading resistance throughout the population. 

These small molecules can also be produced by almost all forms of bacteria, which suggests that they are a form of universal communication. Therefore, if we were able to block these small amino acids, we may be able to prevent the rapid spread of resistance between groups of bacteria. 

What do you think about this research? Write below with your comments or questions. 

Building a heart from scratch: A how to guide.

A few weeks ago I wrote a post called ‘How can you mend a broken heart?’, which you can find here, and now I find myself able to talk about the extraordinary feat of actually building one from scratch.

There has always been a shortage of hearts available for transplant for the patients who need them, mainly because donated hearts are often damaged through illness or resuscitation attempts. Whilst some efforts to resolve this problem have involved trying to get more people to donate their hearts after they die, one lab in America is attempting to circumvent the problem completely by growing hearts from scratch instead.

It’s an easy enough thing to say, but in practice the growth of whole human organs has proven incredibly difficult. Some success has been achieved with the simplest, hollow organs like the trachea and the bladder, but the amount of coordination required between dozens of different cell types within a complex organ makes it near impossible.

Because of this, researchers quickly realised that they would need an already functioning biological scaffold to build their heart around. That scaffold came in the form of a donor heart from a recently deceased individual, which had been stripped of its cells (decellularization). This left behind only the supporting extracellular matrix and collagen, which could then be repopulated with new cells (recellularization). Now, that might not seem to make much sense. Why would you take an already functioning heart and strip it of all its cells, only to later replace those cells? To answer that, we need to think about the main problem facing tissue transplants: tissue rejection.

If any of you have ever watched ER or House, and a patient in need of an organ transplant is being treated, you’ll probably have heard the phrase ‘is there a tissue match?’ That means that for an organ to be transplanted into a patient, that organ needs to be similar enough to the patient so that their immune system doesn’t reject it. This results in a lot of people who need an organ being left without, due to the fact that any available organs aren’t a ‘tissue match’.

With this new method of organ growth, tissue rejection isn’t a possibility. This is because of the source of the cells used to regrow the heart tissue: the patient needing the transplant. By taking adult cells from the patient and reprogramming them into thinking they’re stem cells (turning them into induced pluripotent stem cells, or iPS cells), the cells of the new heart will be an identical genetic match to the patient, ensuring that the new organ won’t be rejected.

If that were the end of the story then we would have solved the problem of organ shortage and a lot of people would be much better off. However, in science things don’t always go to plan. It might be easy to say ‘let’s use stem cells to regrow the heart’, but in reality the number of factors that contribute to the growth of an organ is enormous, and all of them need to be recreated to grow an organ in the lab. Cells are able to sense the environment around them, including the pressure exerted on them by other cells, the beating action of a functioning heart, and the nutrients and oxygen around them. To try and recreate these conditions, researchers placed the heart in a bioreactor that attempts to mimic the heart’s natural environment. This provided some positive results, with organs transplanted into rats functioning fully, even if only for a short period of time.


This marks a monumental step forward in our ability to grow organs that are specifically tailored to the needs of a patient. However, there are some sceptics.  Many believe that there are too many complications in the field of tissue regeneration for this to ever be a viable source of organs, and that the most we can ever hope to achieve is the generation of parts of an organ, such as lobes of a kidney or valves of a heart. Well, let’s just hope that they’re wrong. 

What do you think about this research? Write below with your comments or questions, and join the mailing list at the top right of this page to get emails every time I leave a post. 

Tuesday, 2 July 2013

New research could take your breath away.

A new method has recently been developed that has allowed the visualization of the lungs of asthma sufferers.

Asthma is a disease which affects the airways in our respiratory system that carry the air in and out of our lungs. During an asthmatic attack these airways become inflamed and contract, resulting in their narrowing and thus preventing air from passing through them.

Lots of you reading this will have experienced what this can feel like: the tightness in your chest, the coughing and wheezing. It’s something sufferers always dread, and always has us clutching for our inhalers. Asthma affects millions of people around the world and there is currently no cure. However, the development of a new technology could give us a much clearer insight into this affliction.

This new method requires asthma sufferers to inhale the harmless gas helium-3, and then be scanned by an MRI machine. The helium-3 can be visualised by the MRI scan, with an image produced like the one below.

The coloured areas represent parts of the lung where air can easily permeate, with black areas indicating portions where air cannot reach. In healthy patients the whole of the lung can be visualised. However, in patients with asthma the amount of black on the scan is much greater, giving us a much clearer image of which areas of the lung are most affected.

This method gives us much more information about asthma, and could one day help develop the long-awaited cure.

What do you think about this research? Write below with your comments or questions, and sign up for email notifications of whenever I leave a post at the top right of this page.


Email – newsinscience@gmail.com with any questions or post suggestions. 

Sunday, 30 June 2013

Rats with spinal injury get control of their bladders back.

Scientists in America have made a breakthrough in repairing the damaged spinal cords of rats. 

Somewhere in a lab in America right now, every single day is dedicated to giving rats control of their bladders. Some people might look at that and think that their work is unimportant, but in actuality, they couldn’t be farther from the truth.

Let me explain. For decades now, labs across the world have transplanted nerve cells into paralysed animals in an attempt to restore some function by connecting two disconnected areas of the spinal cord. However, getting these cells to connect to each other has proved difficult, mainly because of the scar tissue that forms over the damage and prevents cellular regeneration.

That is, until now. Scientists in America have performed excruciatingly complex surgery on paralysed rats, transplanting nerves to bridge the gap in their spinal cords. This, in combination with a cocktail of injections and chemicals which break down scar tissue and encourage cellular growth, has resulted in injured nerve cells kick-starting their own growth for the very first time.

As you can tell from the title of this post, the rats didn’t miraculously jump up and start dancing about. However, what they did do is regain control of their bladders. Whilst on its own that might not seem like much, when you consider what this breakthrough means for future studies, the knock-on effect could be astronomical, giving paralysed patients back a piece of their dignity.

The next step in this research will be recreating this technique in larger and larger animals, without any sign of negative side effects, until eventually it can be used on humans.

Write below with your comments and questions, and sign up to the mailing list at the top of this page to get notifications of whenever I leave a post.