Photo: Red poppy, Auckland Botanic Gardens, Auckland, New Zealand, by Sandy Millar via Unsplash.
As John West noted here last week, the Journal of Theoretical Biology has published an explicitly pro-intelligent design article, "Using statistical methods to model the fine-tuning of molecular machines and systems." Let's take a closer look at the contents. The paper is math-heavy, discussing statistical models for making inferences, but it is also groundbreaking for this crucial reason: it considers and proposes intelligent design, by name, as a viable explanation for the origin of fine-tuning in biology. This is a major breakthrough for science, but also for freedom of speech. If the paper is any indication, appearing as it does in a prominent peer-reviewed journal, some of the suffocating constraints on ID advocacy may be coming off.
The authors are Steinar Thorvaldsen, a professor of information science at the University of Tromsø in Norway, and Ola Hössjer, a professor of mathematical statistics at Stockholm University. The paper, which is open access, begins by noting that while fine-tuning is widely discussed in physics, it needs to be considered more in the context of biology:
Fine-tuning has received much attention in physics, and it states that the fundamental constants of physics are finely tuned to precise values for a rich chemistry and life permittance. It has not yet been applied in a broad manner to molecular biology.
The authors explain the paper's main thrust:
However, in this paper we argue that biological systems present fine-tuning at different levels, e.g. functional proteins, complex biochemical machines in living cells, and cellular networks. This paper describes molecular fine-tuning, how it can be used in biology, and how it challenges conventional Darwinian thinking. We also discuss the statistical methods underpinning fine-tuning and present a framework for such analysis.
They explain how fine-tuning is defined. The definition is essentially equivalent to specified complexity:
We define fine-tuning as an object with two properties: it must a) be unlikely to have occurred by chance, under the relevant probability distribution (i.e. complex), and b) conform to an independent or detached specification (i.e. specific).
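The two-part definition can be sketched as a simple predicate: an outcome is flagged only when it is both improbable under the assumed chance distribution and matches an independently given specification. Everything in the sketch below (the function name, the threshold, the example numbers) is our own illustration, not the authors' formalism:

```python
import math

def is_fine_tuned(p_event, matches_spec, alpha=1e-12):
    """Toy version of the paper's two-part definition (names are ours):
    an outcome counts as fine-tuned only if it is (a) unlikely under the
    assumed chance distribution and (b) fits an independent specification."""
    complex_enough = p_event < alpha        # condition (a): complex
    return complex_enough and matches_spec  # condition (b): specific

# A specific 100-residue sequence drawn uniformly from a 20-letter alphabet:
p = (1 / 20) ** 100
print(math.log10(p))  # about -130, far below any plausible threshold

print(is_fine_tuned(p, matches_spec=True))   # improbable AND specified
print(is_fine_tuned(p, matches_spec=False))  # improbable but unspecified
print(is_fine_tuned(0.5, matches_spec=True)) # specified but not improbable
```

Note that both conditions are required: a lottery draw is improbable but unspecified, while a common outcome matching a pattern is specified but not improbable.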
They then introduce the concept of design, and explain how humans are innately able to recognize it:
A design is a specification or plan for the construction of an object or system, or the result of that specification or plan in the form of a product. The very term "design" is from the Medieval Latin word designare (denoting "mark out, point out, choose"), from de ("out") and signum ("identifying mark, sign"), hence a public notice that advertises something or gives information. The design usually has to satisfy certain goals and constraints. It is also expected to interact with a certain environment, and thus be realized in the physical world. Humans have a powerful intuitive understanding of design that precedes modern science. Our common intuitions invariably begin with recognizing a pattern as a mark of design. The problem has been that our intuitions about design have been unrefined and pre-theoretical. For this reason, it is relevant to ask ourselves whether it is possible to turn the tables on this disparity and place those rough and pre-theoretical intuitions on a firm scientific foundation.
That last sentence is key: the purpose is to understand if there is a scientific method by which design can be inferred. They propose that design can be identified by uncovering fine-tuning. The paper explicates statistical methods for understanding fine-tuning, which they argue reflects design:
Fine-tuning and design are related entities. Fine-tuning is a bottom-up method, while design is more like a top-down approach. Hence, we focus on the topic of fine-tuning in the present paper and address the following questions: Is it possible to recognize fine-tuning in biological systems at the levels of functional proteins, protein groups and cellular networks? Can fine-tuning in molecular biology be formulated using state of the art statistical methods, or are the arguments just in the eyes of the beholder?
They cite the work of multiple leading theorists in the ID research community.
They return to physics and the anthropic principle, the idea that the laws of nature are precisely suited for life:
Suppose the laws of physics had been a bit different from what they actually are, what would the consequences be? (Davies, 2006). The chances that the universe should be life permitting are so infinitesimal as to be incomprehensible and incalculable. The finely tuned universe is like a panel that controls the parameters of the universe with about 100 knobs that can be set to certain values. If you turn any knob just a little to the right or to the left, the result is either a universe that is inhospitable to life or no universe at all. If the Big Bang had been just slightly stronger or weaker, matter would not have condensed, and life never would have existed. The odds against our universe developing were enormous, and yet here we are, a point that equates with religious implications.
However, rather than getting into religion, they apply statistics to consider the possibility of design as an explanation for the fine-tuning of the universe. They cite ID theorist William Dembski:
William Dembski regards the fine-tuning argument as suggestive, as pointers to underlying design. We may describe this inference as abductive reasoning or inference to the best explanation. This reasoning yields a plausible conclusion that is relatively likely to be true, compared to competing hypotheses, given our background knowledge. In the case of fine-tuning of our cosmos, design is considered to be a better explanation than a set of multi-universes that lacks any empirical or historical evidence.
The article offers additional reasons why the multiverse is an unsatisfying explanation for fine-tuning, namely that multiverse hypotheses do not predict fine-tuning for this particular universe any better than a single-universe hypothesis, and that we should prefer those theories which best predict (for this or any universe) the phenomena we observe in our universe.
The paper reviews the lines of evidence for fine-tuning in biology, including information, irreducible complexity, protein evolution, and the "waiting-time problem." Along the way it considers the arguments of many ID theorists, starting with a short review showing how the literature uses words such as "sequence code," "information," and "machine" to describe life's complexity:
One of the surprising discoveries of modern biology has been that the cell operates in a manner similar to modern technology, while biological information is organized in a manner similar to plain text. Words and terms like "sequence code," "information," and "machine" have proven very useful in describing and understanding molecular biology (Wills, 2016). The basic building blocks of life are proteins, long chain-like molecules consisting of varied combinations of 20 different amino acids. Complex biochemical machines are usually composed of many proteins, each folded together and configured in a unique 3D structure dependent upon the exact sequence of the amino acids within the chain. Proteins employ a wide variety of folds to perform their biological function, and each protein has a highly specified shape with some minor variations.
The paper cites and reviews the work of Michael Behe, Douglas Axe, Stephen Meyer, and Günter Bechly. Some of these discussions are quite extensive. First, the article contains a lucid explanation of irreducible complexity and the work of Michael Behe:
Michael Behe and others presented ideas of design in molecular biology, and published evidence of irreducibly complex biochemical machines in living cells. In his argument, some parts of the complex systems found in biology are exceedingly important and do affect the overall function of their mechanism. The fine-tuning can be outlined through the vital and interacting parts of living organisms. In Darwin's Black Box (Behe, 1996), Behe exemplified systems, like the flagellum bacteria use to swim and the blood-clotting cascade, that he called irreducibly complex, configured as a remarkable teamwork of several (often a dozen or more) interacting proteins. Is it possible on an incremental model that such a system could evolve for something that does not yet exist? Many biological systems do not appear to have a functional viable predecessor from which they could have evolved stepwise, and the probability of occurrence in one leap by chance is extremely small. To rephrase the first man on the moon: That's no small steps of proteins, no giant leap for biology.
A Behe-system of irreducible complexity was mentioned in Section 3. It is composed of several well-matched, interacting modules that contribute to the basic function, wherein the removal of any one of the modules causes the system to effectively cease functioning. Behe does not ignore the role of the laws of nature. Biology allows for changes and evolutionary modifications. Evolution is there, irreducible design is there, and they are both observed. The laws of nature can organize matter and force it to change. Behes point is that there are some irreducibly complex systems that cannot be produced by the laws of nature:
If a biological structure can be explained in terms of those natural laws [reproduction, mutation and natural selection] then we cannot conclude that it was designed... however, I have shown why many biochemical systems cannot be built up by natural selection working on mutations: no direct, gradual route exists to these irreducibly complex systems, and the laws of chemistry work strongly against the undirected development of the biochemical systems that make molecules such as AMP (Behe, 1996, p. 203).
Then, even if the natural laws work against the development of these irreducible complexities, they still exist. The strong synergy within the protein complex makes it irreducible to an incremental process. They are rather to be acknowledged as fine-tuned initial conditions of the constituting protein sequences. These structures are biological examples of nano-engineering that surpass anything human engineers have created. Such systems pose a serious challenge to a Darwinian account of evolution, since irreducibly complex systems have no direct series of selectable intermediates, and in addition, as we saw in Section 4.1, each module (protein) is of low probability by itself.
The article also reviews the peer-reviewed research of protein scientist Douglas Axe, as well as his 2016 book Undeniable, on the evolvability of protein folds:
An important goal is to obtain an estimate of the overall prevalence of sequences adopting functional protein folds, i.e. the right folded structure, with the correct dynamics and a precise active site for its specific function. Douglas Axe worked on this question at the Medical Research Council Centre in Cambridge. The experiments he performed showed a prevalence between 1 in 10^50 and 1 in 10^74 of protein sequences forming a working domain-sized fold of 150 amino acids (Axe, 2004). Hence, functional proteins require highly organised sequences, as illustrated in Fig. 2. Though proteins tolerate a range of possible amino acids at some positions in the sequence, a random process producing amino-acid chains of this length would stumble onto a functional protein only about once in every 10^50 to 10^74 attempts due to genetic variation. This empirical result is quite analogous to the inference from fine-tuned physics.
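The scale of these prevalence figures is easier to grasp with a little arithmetic. The sketch below treats blind search as repeated independent sampling; the trial budget of roughly 10^40 protein-sized sampling events is our own illustrative assumption, not a figure from the paper:

```python
# Axe's reported prevalence range for a working, domain-sized fold of
# 150 amino acids (Axe, 2004):
prevalence_high = 1e-50  # optimistic end of the range
prevalence_low = 1e-74   # pessimistic end of the range

# A deliberately generous trial budget (our assumption, not the paper's):
# on the order of 1e40 protein-sized sampling events in Earth's history.
trials = 1e40

for p in (prevalence_high, prevalence_low):
    # Expected number of functional folds found by blind sampling:
    print(f"prevalence {p:.0e}: expected successes = {trials * p:.0e}")
```

Even at the optimistic end, the expected number of successes is about 10^-10, which is the arithmetic behind the claim that the search space lies beyond the reach of a blind process.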
The search space turns out to be impossibly vast for blind selection to have even a slight chance of success. The contrasting view is innovation based on ingenuity, cleverness and intelligence. An element of this is what Axe calls functional coherence, which always involves hierarchical planning, and hence is a product of fine-tuning. He concludes: Functional coherence makes accidental invention fantastically improbable and therefore physically impossible (Axe, 2016, p. 160).
They conclude that the literature shows the probability of finding a functional protein in sequence space can vary broadly, but commonly remains far beyond the reach of Darwinian processes (Axe, 2010a).
Citing the work of Gnter Bechly and Stephen Meyer, the paper also reviews the question of whether sufficient time is allowed by the fossil record for complex systems to arise via Darwinian mechanisms. This is known as the waiting-time problem:
Achieving fine-tuning in a conventional Darwinian model: The waiting time problem
In this section we will elaborate further on the connection between the probability of an event and the time available for that event to happen. In the context of living systems, we need to ask whether conventional Darwinian mechanisms have the ability to achieve fine-tuning during a prescribed period of time. This is of interest in order to correctly interpret the fossil record, which is often interpreted as having long periods of stasis interrupted by very sudden, abrupt changes (Bechly and Meyer, 2017). Examples of such sudden changes include the origin of photosynthesis, the Cambrian explosion, the evolution of complex eyes and the evolution of animal flight. The accompanying genetic changes are believed to have happened very rapidly, at least on a macroevolutionary timescale, during a time period of length t. In order to test whether this is possible, a mathematical model is needed to estimate the probability P(A) of the event A that the required genetic changes in a species take place within a time window of length t.
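The quantity P(A) is the probability that the required changes arrive within a window of length t. A minimal way to model such a waiting time is a Poisson arrival process, where P(A) = 1 - exp(-lambda * t); this simplification, and the rate and window sizes below, are our own illustrative assumptions, not the authors' model or estimates:

```python
import math

def p_within_window(rate_per_year, years):
    """P(at least one qualifying event in a window of length t), assuming
    events arrive as a Poisson process with constant rate:
    P(A) = 1 - exp(-lambda * t). A simplified stand-in for the paper's P(A)."""
    return 1.0 - math.exp(-rate_per_year * years)

# Illustrative numbers (assumptions): a coordinated genetic change arriving
# at rate 1e-10 per year, over a 10-million-year macroevolutionary window.
print(p_within_window(1e-10, 1e7))  # roughly 1e-3, rare even over deep time
```

The point of such a model is that the waiting-time question becomes quantitative: given a rate estimate, one can ask directly whether the window the fossil record allows is long enough.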
Throughout the discussions are multiple citations of BIO-Complexity, a journal dedicated to investigating the scientific evidence for intelligent design.
Lastly, the authors consider intelligent design as a possible explanation of biological fine-tuning, citing heavily the work of William Dembski, Winston Ewert, Robert J. Marks, and other ID theorists:
Intelligent Design (ID) has gained a lot of interest and attention in recent years, mainly in the USA, creating public attention as well as triggering vivid discussions in the scientific and public world. ID aims to adhere to the same standards of rational investigation as other scientific and philosophical enterprises, and it is subject to the same methods of evaluation and critique. ID has been criticized, both for its underlying logic and for its various formulations (Olofsson, 2008; Sarkar, 2011).
William Dembski originally proposed what he called an explanatory filter for distinguishing between events due to chance, lawful regularity or design (Dembski, 1998). Viewed on a sufficiently abstract level, its logic is based on well-established principles and techniques from the theory of statistical hypothesis testing. However, it is hard to apply in many interesting biological applications or contexts, because a huge number of potential but unknown scenarios may exist, which makes it difficult to phrase a null hypothesis for a statistical test (Wilkins and Elsberry, 2001; Olofsson, 2008).
The re-formulated version of a complexity measure published by Dembski and his coworkers is named Algorithmic Specified Complexity (ASC) (Ewert et al., 2013; 2014). ASC incorporates both Shannon and Kolmogorov complexity measures, and it quantifies the degree to which an event is improbable and follows a pattern. Kolmogorov complexity is related to compression of data (and hence patterns), but suffers from the property of being unknowable, as there is no general method to compute it. However, it is possible to give upper bounds for the Kolmogorov complexity, and consequently ASC can be bounded without being computed exactly. ASC is based on context and is measured in bits. The same authors have applied this method to natural language, random noise, folding of proteins, images, etc. (Marks et al., 2017).
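Because any lossless compressor gives an upper bound on Kolmogorov complexity, a lower bound on ASC is computable. The sketch below uses zlib as the compressor and a uniform-bytes chance hypothesis; both choices are our own simplifications for illustration, not the method of Ewert et al.:

```python
import zlib

def asc_lower_bound(data: bytes) -> float:
    """Lower bound on Algorithmic Specified Complexity, in bits.
    ASC(x) = I(x) - K(x): Shannon surprisal under a chance hypothesis minus
    Kolmogorov complexity. K(x) is uncomputable, but the length of any
    lossless compression of x is an upper bound on it (up to a small
    constant), so the difference below lower-bounds ASC. The uniform-bytes
    chance hypothesis is our simplifying assumption."""
    surprisal_bits = 8 * len(data)                  # -log2 P, uniform bytes
    k_upper_bits = 8 * len(zlib.compress(data, 9))  # compressor bound on K(x)
    return surprisal_bits - k_upper_bits

patterned = b"ATGATGATG" * 50  # highly repetitive: compresses very well
noisy = bytes(range(256)) * 2  # far less structure to exploit
print(asc_lower_bound(patterned) > asc_lower_bound(noisy))
```

The patterned sequence scores much higher: it is just as improbable under the chance hypothesis as any other string of its length, but it also follows a short describable pattern, which is exactly the combination ASC is meant to capture.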
The laws, constants, and primordial initial conditions of nature present the flow of nature. These purely natural objects discovered in recent years show the appearance of being deliberately fine-tuned. Functional proteins, molecular machines and cellular networks are both unlikely when viewed as outcomes of a stochastic model with a relevant probability distribution (having a small P(A)), and at the same time they conform to an independent or detached specification (the set A being defined in terms of specificity). These results are important and deduced from central phenomena of basic science. In both physics and molecular biology, fine-tuning emerges as a uniting principle and synthesis, an interesting observation in itself.
In this paper we have argued that a statistical analysis of fine-tuning is a useful and consistent approach to model some of the categories of design: irreducible complexity (Michael Behe) and specified complexity (William Dembski). As mentioned in Section 1, this approach requires a) that a probability distribution for the set of possible outcomes is introduced, and b) that a set A of fine-tuned events, or more generally a specificity function f, is defined. Here b) requires some a priori understanding of what fine-tuning means for each type of application, whereas a) requires a naturalistic model for how the observed structures would have been produced by chance. The mathematical properties of such a model depend on the type of data that is analyzed. Typically a stochastic process should be used that models a dynamic feature such as stellar, chemical or biological (Darwinian) evolution. In the simplest case the state space of such a stochastic process is a scalar (one nucleotide or amino acid), a vector (a DNA or amino acid string) or a graph (protein complexes or cellular networks).
A major conclusion of our work is that fine-tuning is a clear feature of biological systems. Indeed, fine-tuning is even more extreme in biological systems than in inorganic systems. It is detectable within the realm of scientific methodology. Biology is inherently more complicated than the large-scale universe, and so fine-tuning is even more of a feature there. Still, more work remains in order to analyze more complicated data structures, using more sophisticated empirical criteria. Typically, such criteria correspond to a specificity function f that is not merely a helpful abstraction of an underlying pattern, such as biological fitness. One rather needs a specificity function that, although of non-physical origin, can be quantified and measured empirically in terms of physical properties such as functionality. In the long term, these criteria are necessary to make the explanations both scientifically and philosophically legitimate. However, we have enough evidence to demonstrate that fine-tuning and design deserve attention in the scientific community as a conceptual tool for investigating and understanding the natural world. The main agenda is to explore some fascinating possibilities for science and create room for new ideas and explorations. Biologists need richer conceptual resources than the physical sciences have until now been able to initiate, in terms of complex structures having non-physical information as input (Ratzsch, 2010). Yet researchers have more work to do in order to establish fine-tuning as a sustainable and fully testable scientific hypothesis, and ultimately a Design Science.
This is a significant development. The article gives the arguments of intelligent design theorists a major hearing in a mainstream scientific journal. And don't miss the purpose of the article, which is stated in its final sentence: to work towards "establish[ing] fine-tuning as a sustainable and fully testable scientific hypothesis, and ultimately a Design Science." The authors present compelling arguments that biological fine-tuning cannot arise via unguided Darwinian mechanisms. Some explanation is needed to account for why biological systems show the appearance of being deliberately fine-tuned. Despite the noise that often surrounds this debate, for ID arguments to receive such a thoughtful and positive treatment in a prominent journal is itself convincing evidence that ID has intellectual merit. Claims of ID's critics notwithstanding, design science is being taken seriously by scientists.