Field of Science

Showing posts with label experiment. Show all posts

The devil under the hood: To look or not to look?

Modern biology and chemistry would be unthinkable without the array of instrumental techniques at their disposal; indeed, one can make a case that it was new methods (think NMR, x-ray crystallography, PCR) rather than new concepts that were really responsible for revolutions in these disciplines. The difference between a good paper and a great paper is sometimes the foolproof confirmation of a decisive concept, often made possible only by the application of a novel technique.

Yet the onslaught of these methods has brought with it the burden of responsibility. Ironically, the increasing user-friendliness of the tools has only exacerbated this burden. Today it's all too easy to press a button and communicate a result which may be utter nonsense. In a recent article in Nature titled "Research tools: Understand how it works", David Piston from Vanderbilt laments the fact that many modern instruments and techniques have turned into black boxes, used by students and researchers without an adequate understanding of how they work. While acknowledging the undoubted benefits that automation has brought to the research enterprise, Piston points out the flip side:

Unfortunately, this scenario is becoming all too common in many fields of science: researchers, particularly those in training, use commercial or even lab-built automated tools inappropriately because they have never been taught the details about how they work. Twenty years ago, a scientist wanting to computerize a procedure had to write his or her own program, which forced them to understand every detail. If using a microscope, he or she had to know how to make every adjustment. Today, however, biological science is replete with tools that allow young scientists simply to press a button, send off samples or plug in data — and have a result pop out. There are even high-throughput plate-readers that e-mail the results to the researcher.

Indeed, and as a molecular modeler I can empathize, since modeling presents a classic example of black-box versus nuts-and-bolts approaches. On one hand you have the veteran programmers who did quantum chemistry on punch cards, and on the other you have application scientists like me who are much more competent at looking at molecular structures than at code (you also have those who can do both, but these are the chosen few). There's a classic tradeoff here between time spent and benefits accrued. In the old days (which in modeling lingo go back only fifteen years or so), most researchers wrote their own programs, compiled and debugged them, and tested them rigorously on model systems. While this may seem like the ideal training environment, the fact is that in modern research environments, and especially in an industry like pharma, this kind of from-scratch methodology development is often just not possible because of time constraints. If you are a modeler in a biotech or pharma company, your overlords rightly want you to apply existing software to discover new drugs, not spend most of your time writing it. In addition, many modelers (especially in this era of user-friendly software) don't have strong programming skills. So it's considered far better to write a hefty check to a company like Schrödinger or OpenEye that has the resources to spend all its time perfecting such programs.

The flip side of this however is that most of the software coming from these companies is not going to be customized for your particular problem, and you can start counting the number of ways in which a small change between training and test sets can dramatically impact your results. The only way to truly make these programs work for you is to look under the hood, change the code at the source and reconfigure the software for your unique situation. Unfortunately this runs into the problem stated above, namely the lack of personnel, resources and time for doing that kind of thing.

So how do you solve this problem? The solution is not simple, but the author hints at one possible approach when he suggests providing more graduate-level opportunities to learn the foundations of the techniques. For a field like molecular modeling, there are still very few formal courses available in universities. Implementing such courses would give students a head start on the relevant background, so that they arrive in industry at least reasonably well-versed in the foundations and can subsequently spend their time applying the background instead of acquiring it.

The same principle applies to more standard techniques like NMR and x-ray diffraction. For instance, even today most courses in NMR start with a basic overview of the technique followed by dozens of problems in structure determination. This is good training for a synthetic chemist, but what would be really useful is a judiciously chosen list of case studies from the current literature illustrating the promises and pitfalls of magnetic resonance. Such case studies would show NMR applied to messy, real-world problems rather than ideal cases, and it's only by studying them that students can get a real feel for the kinds of problems for which NMR is truly the best technique.

Thus, gaining a background in the foundations of a particular technique is only one aspect of the problem, and one which in fact strikes me as less important than getting to know the strengths and limitations of the technique. To me it's not as important to formally learn quantum chemistry as it is to get a feel for the kinds of systems for which it works. In addition you want to know what the results really mean, since the numbers in the output often mean more, or less, than they appear to. Learning the details of perturbation theory is not as key as knowing when to apply it. If the latter is your goal, it may be far more fruitful to scour the literature and get a feel for the circumstances in which the chosen technique works than to just take formal classes. And conveying this feel for the strengths and limitations of techniques is again something we are not doing very well in graduate school, and something we should be doing.

"The Thexperiment Cafe": Bridging theory and experiment?

Discodermolide and dictyostatin are complex, flexible molecules that bind to the protein tubulin and promote the assembly of microtubules during cell division. This mechanism, similar to that of the bestselling drug Taxol, derails the precise timing of cell division and kills cells by triggering apoptosis, or programmed cell death. Since cancer is quintessentially a disease of aberrant cell division, both molecules have emerged as promising anticancer agents. Discodermolide and dictyostatin are of special interest not only because of their extraordinary potency, but especially because they seem to retain that potency against cells that have become resistant to Taxol.

A year ago I co-authored a J. Med. Chem. paper that proposed a protein-bound conformation for discodermolide using a combination of NMR data and molecular modeling techniques. We followed up with a paper published last week in JACS in which we applied similar techniques to dictyostatin. In a nutshell, the two studies revealed a surprising and unexpected dissimilarity in the solution and protein-bound 3D conformations of the molecules, a dissimilarity belied by their superficially similar 2D structures. While dictyostatin presents a diverse family of conformations, discodermolide sustains a remarkably constant conformation in very diverse environments (solid state, solution, and the protein binding site) that is enforced primarily by steric factors.
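The "NMR data plus modeling" approach can be caricatured with a toy sketch: score each candidate conformation by how well its interproton distances satisfy NOE-derived distance restraints, and prefer the one with the lowest deviation. Every atom label and distance below is hypothetical, invented purely for illustration; the real studies use many more restraints and far more sophisticated scoring.

```python
from math import sqrt

# Hypothetical NOE-derived interproton distance restraints (in angstroms).
# In a real study these come from NOESY/ROESY cross-peak intensities.
noe_restraints = {
    ("H7", "H21"): 2.8,
    ("H2", "H17"): 3.5,
    ("H9", "H14"): 4.1,
}

def restraint_rmsd(conformer_distances, restraints):
    """RMS deviation between a conformer's interproton distances
    and the NOE-derived target distances (lower is better)."""
    devs = [conformer_distances[pair] - target
            for pair, target in restraints.items()]
    return sqrt(sum(d * d for d in devs) / len(devs))

# Two hypothetical candidate conformations with their model distances.
candidates = {
    "conformer_A": {("H7", "H21"): 2.9, ("H2", "H17"): 3.4, ("H9", "H14"): 4.0},
    "conformer_B": {("H7", "H21"): 4.5, ("H2", "H17"): 2.1, ("H9", "H14"): 5.5},
}

# Pick the conformer that best satisfies the restraints.
best = min(candidates, key=lambda name: restraint_rmsd(candidates[name], noe_restraints))
print(best)  # → conformer_A
```

The point of the sketch is only that several conformers can all satisfy the data to varying degrees, which is exactly why competing proposals can coexist until a crystal structure settles the matter.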

I would like to describe the work in the latest paper separately, but for now I am intrigued by another aspect of the problem. In both cases we proposed protein-bound conformations of medicinally relevant molecules, and in both cases our conformations were not unique. In the case of dictyostatin there is at least one alternative proposed conformation, while in the case of discodermolide there are no fewer than two. Of course we think that our proposed conformation better satisfies the data (otherwise we wouldn't have published the papers!), but the fact is that we are now presented with a puzzle. Which of the proposed conformations is correct, and what technique would best resolve the quandary? The answer is unambiguous: x-ray crystallography of dictyostatin and discodermolide bound to tubulin should tell us what the correct conformation is.

Max Perutz once said that one of the most attractive qualities of science is that there is usually only one right answer, unlike politics where the answer depends on the viewpoint. I think this example illustrates that quality. The question is well-defined. We now have several competing proposals for the protein-bound conformations of two important molecular targets, but we know that there must be only one bound conformation in the solid-state, one right answer. Which conformation among these is it? Or is it a totally different one which has slipped through the cracks? The question is important not only because it would reveal the mode of action of a potentially novel class of anticancer drugs, but also because it could be very useful to organic chemists who could then modify the structures of the drugs based on their bound conformations to improve their potency and other properties.

In the case of discodermolide: one molecule, three proposed conformations, but only one true conformation to rule them all. Which is it? In one sense the gauntlet has been thrown down before crystallographers, and the goal should be tantalizing for them, especially because there is a single right answer. The task will undoubtedly be difficult. Until now only the tubulin-binding drug Taxol has succumbed to x-ray crystallography, while the drug epothilone has lent itself to electron diffraction. Both dictyostatin and discodermolide are flexible molecules that won't yield to protein co-crystallization easily. And yet the solution would almost certainly result in publication in a top journal and new directions for synthetic chemists. Most importantly, it would be the definitive resolution of a scientific puzzle that is currently open.

But this train of thought brings another idea to mind. Wouldn't it be great if we had a dedicated website where experimentalists post results that theorists have to explain, and theorists post predictions that experimentalists have to validate? The interplay between theory and experiment has of course been the bedrock of science since antiquity. But all too often, the right kind of puzzle is not clearly communicated by one group to the other. Sure, if you work in a particular field, you will probably be up to speed on its literature. But the sheer deluge of information ensures occasional omissions, and sometimes you may be interested in potential challenges from other areas that cannot easily be communicated to you. For instance, the dictyostatin/discodermolide puzzle may be interesting to scientists who have nothing to do with tubulin but who are simply eager to test a new structure determination method on such complicated molecules. As we all know, solutions to scientific puzzles can emerge from unexpected corners, and scientists sometimes find in other fields surprises that pique their curiosity; the spectacular harnessing of physics-based methods in chemistry and biology is a well-known example.

Yet scientists in one field cannot possibly keep track of all the other fields whose developments may be attractive to them. A physicist developing a promising new electron diffraction technique, potentially applicable to tubulin and discodermolide, is usually not going to be aware of the literature in this area. In such cases it would be tremendously useful to have a website whose express purpose is to serve as a bridge between theorists and experimentalists. The website would be divided into the traditional fields of science along with interdisciplinary sections. Every week, a theorist or experimentalist would pose a puzzle from his or her field whose unambiguous solution he or she believes would be amenable to experimental techniques. The puzzle would be tagged with all the fields to which it could be relevant, and people could vote a problem up or down if they found it particularly enticing and tractable. Experimentalists from different disciplines could then take a look, and the right answer might come from left field, from quarters completely unexpected to the scientist who posed the question. Some querying would still be necessary, but the specific nature of the website would require far less wading through the literature of other fields than is usually needed. Similarly, experimentalists could post curious, unexplained results that would tickle theorists' grey cells.
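As a sketch of how such a site might be organized, here is a minimal, hypothetical data model in Python: puzzles tagged with candidate fields, up/down voting, and queries by field. Every class and name here is invented for illustration; a real site would obviously need persistence, user accounts, and moderation.

```python
from dataclasses import dataclass

@dataclass
class Puzzle:
    title: str
    posted_by: str   # "theorist" or "experimentalist"
    fields: set      # all fields the puzzle might be relevant to
    votes: int = 0

class PuzzleBoard:
    """Toy cross-disciplinary puzzle board: post, vote, and query by field."""

    def __init__(self):
        self.puzzles = []

    def post(self, puzzle):
        self.puzzles.append(puzzle)

    def vote(self, title, delta):
        # delta is +1 (enticing and tractable) or -1
        for p in self.puzzles:
            if p.title == title:
                p.votes += delta

    def for_field(self, name):
        """Puzzles tagged with a given field, most popular first."""
        hits = [p for p in self.puzzles if name in p.fields]
        return sorted(hits, key=lambda p: p.votes, reverse=True)

board = PuzzleBoard()
board.post(Puzzle("Bound conformation of discodermolide on tubulin",
                  "theorist", {"chemistry", "structural biology", "physics"}))
board.post(Puzzle("Unexplained NMR line broadening in a macrolide",
                  "experimentalist", {"chemistry"}))
board.vote("Bound conformation of discodermolide on tubulin", +1)

# A physicist browsing the "physics" tag finds the tubulin puzzle
# without ever wading through the tubulin literature.
print([p.title for p in board.for_field("physics")])
```

The key design choice is the multi-field tagging: a puzzle posted by a chemist surfaces in a physicist's queue precisely because the poster guessed (or the crowd voted) that it might be solvable from that direction.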

The website could perhaps be called "The Thexperiment Cafe" or something less obnoxious. It would be a place where theorists and experimentalists rendezvous and challenge each other with specific puzzles. It could bypass the usual exhaustive literature searching and serve as a rapid delivery vehicle for problems whose solutions are unambiguous (or even ambiguous!) and which could benefit members from each camp. Experimentalists and theorists could be one big, happy family. And science will always win.

Gernot Frenking is not happy...not at all

Stable is simply "able" with a "st"


Wow. This is a first for me. Three heavyweights of theoretical and computational chemistry have published in Angewandte Chemie a set of prescriptions for theoretical chemists claiming to have discovered new, "stable" molecules. In response, Gernot Frenking, a well-known theoretical chemist himself, has published a piercing and trenchant critique; the journal actually seems to have reproduced the text of his referee's report as the reply. This is a lively and extremely readable debate.

In an article asking for more "realism" from theory, the three heavyweights - Roald Hoffmann, Paul von Ragué Schleyer and Henry Schaefer III - have come up with a roster of suggestions in response to what they see as rather flippant declarations by theoretical chemists of molecules as "stable". One of the annoying habits of theoreticians is to analyze molecules on paper and proclaim them stable; experimentalists then have to sweat it out for years trying to actually make them. Frequently such molecules are stable only under rather extreme conditions, for example in the gas phase at 4 kelvin. To address the animosity that experimentalists feel toward such carefree theoretical predictions, the three chemists have come up with suggestions for publication.

They make some interesting points about the criteria that should be satisfied before declaring a molecule stable. In fact they think one should do away with the word "stable" altogether and replace it with "viable" and "fleeting". For a molecule to be called viable, one has to be clear about the difference between thermodynamic and kinetic stability: it must have a half-life of about a day, must be isolable in condensed phases at room temperature and pressure, and must not react easily with oxygen, nitrogen and ozone (?). Molecules with charges greater than +1 or -1 must also be modeled with "realistic" counterions, and molecules must even be stable under conditions of some humidity. The authors also make suggestions about reporting accuracy and precision, including the well-known point that theoretically reported precision cannot exceed experimentally measured precision.

If theoreticians think these suggestions are asking for too much, they have a friend in Gernot Frenking.

Frenking batters these suggestions down by basically launching two criticisms:
1. The suggestions are too obvious and well-known to be published in Angewandte Chemie
2. The suggestions are heavily biased towards experimentalists' preferences
As Frenking puts it, he expected to walk into a "gourmet restaurant", and was served a "thin soup" instead. Ouch.

I have to say that while the suggestions made by the three prominent scientists are quite sound, Frenking's points are also well-taken. He lambasts the suggestion that realistic counterions should be included in the calculation of a molecule with multiple charges; there are already multiply charged molecules that were predicted to be theoretically stable and then isolated by experiment, and ionic molecules with charges greater than +1 or -1 are routinely isolated in condensed phases. One of the central questions Frenking asks is: why does a molecule need to be so experimentally stable to justify publication of its theoretical existence? After all, there are many molecules in interstellar space that cannot be isolated under average-Joe lab conditions. Under these circumstances, Frenking considers the distinction between "viable" and "fleeting" to be "eyewash" (the European way of putting it, euphemistically).

I resoundingly agree with this contention in particular, harsh as it sounds. Why should experimentalists get an easy pass? The whole point of theory is to push the boundaries of what's experimentally possible, and to suggest that one should publish a theoretical prediction only if it can easily be verified by experiment is to do a disservice to the frontiers of science. While I can understand the angst an experimentalist may feel on seeing an unusual molecule, stable only under extreme conditions, declared "stable" by a theoretician, that's exactly the challenge experimentalists should rise to: devising conditions under which these short-lived molecules can be observed. If they do, they are the ones who carry the day. Since stability is, as is well known, a relative term anyway, why insist on calling something "stable" only if it satisfies the everyday lab conditions of the experimentalist? I believe it is precisely by testing the extreme frontiers of stability that chemistry progresses, and this can only be done by making things hard for experimentalists, not easy. Theoreticians pushing experimentalists and vice versa is how science itself progresses, and there is no reason for either to stop questioning the boundaries of the other's domain.

There are other points and criticisms worth reading, including other referee comments which endorse the article and are also quite interesting. In the end, however, I cannot answer Frenking's central question: should this article have been published in Angewandte Chemie? I will leave that for readers to judge.

Roald Hoffmann, Paul von Ragué Schleyer, Henry F. Schaefer III (2008). Predicting Molecules - More Realism, Please! Angewandte Chemie International Edition, 47 (38), 7164-7167. DOI: 10.1002/anie.200801206

Gernot Frenking (2008). No Important Suggestions. Angewandte Chemie International Edition, 47 (38), 7168-7169. DOI: 10.1002/anie.200802500