## A simple bistable system

Here is a simple bistable system I sometimes use in class. I like it because it’s easy to explain plus the addition of one extra reaction turns it into a relaxation oscillator.

```
import tellurium as te
import random
import pylab as plt
import numpy as np

r = te.loada('''
# ---- BISTABLE SYSTEM -----

$Xo -> S1;  0.5 + Vmax*S1^n/(15 + S1^n);
S1 -> ;    k1*S1;

Xo = 1; S1 = 1;
n = 4; Vmax = 10; k1 = 2;
''')

plt.xlabel('Time')
plt.ylabel('Concentration')
for S1 in np.arange(1.45, 1.55, 0.01):
    r.S1 = S1
    m = r.simulate(0, 5, 100)
    plt.plot(m['time'], m['[S1]'])

# Poor man's multistability scan: start from random initial conditions,
# compute the steady state, and record the eigenvalue at each one
for i in range(10):
    r.S1 = random.random()*5
    r.steadyState()
    print('S1 = ', r.S1, 'eigenvalue = ', r.getFullEigenValues())
```

If you run the above code you’ll get:

```
S1 = 5.145227 eigenvalue = -1.84055
S1 = 5.145227 eigenvalue = -1.84055
S1 = 0.251329 eigenvalue = -1.95768
S1 = 5.145227 eigenvalue = -1.84055
S1 = 1.492258 eigenvalue = 3.004970
S1 = 5.145227 eigenvalue = -1.84055
S1 = 1.492258 eigenvalue = 3.004970
S1 = 5.145227 eigenvalue = -1.84055
S1 = 5.145227 eigenvalue = -1.84055
S1 = 0.251329 eigenvalue = -1.957684
```

The above output is what I call the poor man's way of finding multiple steady states. I scan across the initial conditions, computing the steady state in each case along with the corresponding eigenvalue. (You can also just assign random values to the initial value of S1, which is better if you don't know what starting conditions to use.) As you can see, it found three steady states: two stable (negative eigenvalue) and one unstable (1.492, positive eigenvalue).

The plot above shows time course trajectories for about 10 different starting points for S1. The two steady states are clearly evident.

Out of interest, here is the addition of the extra reaction at the front that turns the system into a relaxation oscillator. You have to make a slight modification to the second reaction's rate law so that it becomes a function of its reactant (which it wasn't before).

```
$Xo -> So;  k0*Xo;
So -> S1;   k1*So + Vmax*So*S1^n/(15 + S1^n);
S1 -> ;     k2*S1;

Xo = 1; S1 = 1; k0 = 0.07; k1 = 0.04;
n = 4; Vmax = 6; k2 = 0.1;
```

## Simulate Regular Doses using Antimony/SBML in the Tellurium Python Tool

Someone recently asked how to create a model where a molecular species received a regular dose at a fixed interval of 12 hours.

They wanted something where a species A is being degraded but at regular intervals is increased by a fixed amount. See the plot below.

Lucian Smith and I came up with two possible solutions, which are shown below. The first solution, from Lucian, is to set up a variable called trigger that we define using the differential equation trigger' = 1 (note the apostrophe). The ODE ensures that trigger increases linearly during the simulation. We then set up an event that fires when trigger exceeds 12 hours, at which point trigger is reset to zero and we add the dose of 3 to species A. trigger then starts to integrate again from zero and the cycle repeats.

```
import tellurium as te

r = te.loada("""
A -> ; k1*A;
k1 = 0.1; A = 10

trigger = 0
trigger' = 1

at (trigger > 12): A = A + 3, trigger = 0
""")

m = r.simulate(0, 100, 100, ['time', 'A'])
r.plot()
```

The second solution is less elegant but increases the trigger value each time an event occurs. In the code below, when time exceeds the current trigger value we add the dose and then increase trigger by 12 hours so that it can be triggered again.

```
import tellurium as te

r = te.loada("""
A -> ; k1*A;
k1 = 0.1; A = 10

trigger = 12
at (time > trigger): A = A + 3, trigger = trigger + 12
""")

m = r.simulate(0, 100, 100, ['time', 'A'])
r.plot()
```

## One of my best friends passed away today

It is with enormous and deep sorrow that I write that one of my best friends in the world passed away today as a result of hemangiosarcoma. This is where the spleen becomes cancerous, which eventually results in fatal hemorrhaging. There is virtually no treatment for this. She would likely have passed away of her own accord within 24 hours. We didn't know whether she was in pain or not, she rarely complained about anything, but we suspected she was. She looked absolutely miserable. We made the incredibly difficult decision to put her to sleep instead of letting her suffer for another 24 hours. An incredibly sad day for all the family.

We got her 12.5 years ago as a puppy, full of life. We named her Rhody after the blossoming rhododendrons in our garden. She was a black Labrador, the sweetest dog you could imagine. She would wait outside for me to come home from work, and if she was in the house she'd listen for the sound of the bus that brought me home. When I entered the house she was almost always the first to greet me. She had an uncanny skill of measuring time. Her afternoon meal was at 4pm and somehow she knew when that time came. She would come to me or my wife, looking at us in anticipation of her late lunch, often exactly at 4pm. The same thing would happen at 8pm for her evening snack.

This morning at 7.15am (28 Sept), we heard a loud falling sound. Rhody had collapsed, her legs had given way. We rushed downstairs; she was lying flat out at the bottom of the stairs where she usually waited for us to come down in the morning. She couldn't move and was clearly in a lot of distress. We took her to the vet, her ailment was diagnosed, and we faced the abyss of having to lose her. It was an incredibly traumatic moment. You do not want to go through what we went through. It was unbearable. She was still conscious, her faculties were intact, but her body was failing her and there was nothing that could be done about it. Imagine having to make the decision to let her go. For my wife and myself, this was the hardest and saddest decision we have ever made.

Rhody, we will miss you so deeply. May you rest in peace, our best friend and most loyal companion. Farewell.

## Generating log ranges in Python

This is a very short blog on how to generate logarithmic ranges in python:

```
import numpy as np

# Brute force
x = [0.1, 1, 10, 100, 1000]

# Classic explicit loop
x = 0.1
for i in range(5):
    print(x)
    x = x*10

# List comprehension, more compact
x = [10**i for i in range(-1, 4)]
print(x)

# Lazy evaluation with a generator (saves memory for very large ranges)
for x in (10**i for i in range(-1, 4)):
    print(x)

# Using the built-in numpy function
x = np.logspace(-1, 3, num=5)
print(x)
```

## Explaining the smallest chemical network that can display a Hopf bifurcation

Last year I did a short blog on simulating the smallest chemical network that could display Hopf bifurcation oscillations. Here I want to revisit this with an eye to explain why it oscillates. The paper in question is

Wilhelm, Thomas, and Reinhart Heinrich. “Smallest chemical reaction system with Hopf bifurcation.” Journal of mathematical chemistry 17.1 (1995): 1-14.

In the blog I showed the figure that was given in the paper. The question is: how does this network generate oscillations? To answer this question, we must first redraw the network. I'm going to make two changes to the figure.

You'll notice that the reaction X + Y -> Y consumes and regenerates Y, so that Y doesn't actually change in concentration as a result of this reaction. Instead, we can treat Y as an activator of the reaction. In the paper the rate law for this reaction was k*X*Y; we leave this as is but change the reaction to X -> waste. This won't alter the dynamics at all, but we can reinterpret the reaction as something activated by Y without consuming Y.

For the reaction X + Y -> Y, we can write the equivalent form

```
X -> waste; activated by Y: v = k*X*Y
```

The other interesting reaction is X + A -> X + X. This is what is called an auto-catalytic reaction, that is, X stimulates its own production, and this is key to the origin of the oscillations. In the diagram, we can replace this reaction with X activating itself, in other words, a positive feedback loop. This reaction on its own has one steady state, when X is zero. If X is not zero, the concentration of X tends to infinity at infinite time.

With these changes, we'll redraw the network, keeping the reaction numbers the same as those found in the original figure. In the new drawing, we can see a positive feedback loop in blue formed from the X -> X + X reaction and a delayed negative feedback loop in red that goes from X to reaction 2 via reactions 4 and 5. The feedback loop is negative with respect to X because increases in X result in activation of reaction 2, and hence a higher degradation rate of X. This is the classic structure for a relaxation oscillator: a positive feedback loop coupled with a negative feedback loop that causes X to turn on and off repeatedly. Let's make a smaller version of the positive feedback unit using reactions 1, 2, and 4:

```
X -> X + X; k1*X
X -> Ao;    k2*Y*X
X -> Z;     k4*X
```

The rate of change of X is given by:

dX/dt = k1 X - k2 X Y - k4 X

We can see that the rate of change of X can be positive or negative depending on the values of X and Y. At low Y, the rate of change of X will be positive, but at high enough Y it will turn negative. In other words, the system can be switched from an unstable to a stable regime by setting Y. If we set k2 = 1 and k4 = 1, then the crossover point from unstable to stable is at Y = k1 - 1. In the model k1 = 4, therefore the rate of change of X switches stability when the level of Y is 3. See the time course plot below.
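As a quick numerical check of this threshold, using the values k1 = 4, k2 = 1, k4 = 1 from above:

```
# dX/dt = (k1 - k2*Y - k4)*X changes sign at the threshold Y = k1 - 1 = 3
k1, k2, k4 = 4.0, 1.0, 1.0
X = 1.0

def dXdt(Y):
    return (k1 - k2*Y - k4) * X

print(dXdt(2.9))   # positive: X grows (unstable regime)
print(dXdt(3.1))   # negative: X decays (stable regime)
```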

When X and Y are small, the network is in an unstable state and X rises; this causes Y to rise, but with a delay due to having to go through Z. However, once Y reaches a threshold dictated by k1, k2, and k4, the rate of change of X goes negative and the system enters a stable regime. As X drops, so does Y, which means the threshold is passed again but in the opposite direction, and the system switches from a stable back to an unstable regime. This loop continues indefinitely, resulting in oscillations.

## Simple Stochastic Code in Python

It's been a while since I blogged (grant writing etc. gets in the way of actual thinking and doing), so here is a quick post that uses Python to model a simple reaction A -> B using the Gillespie next reaction method.

I know importing pylab is discouraged, but I can never remember how to import matplotlib, and pylab is an easy way to get it. It also imports scipy and numpy, but I import numpy separately.

I use lists instead of arrays to accumulate the results because they are faster to extend as the data comes in. Note that with a Gillespie algorithm you don't know beforehand how many data points you'll get.
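The original code isn't reproduced here, but a minimal sketch of the idea for the single reaction A -> B with mass-action rate k*A might look like this (the rate constant, initial amount, and end time are arbitrary choices of mine; with only one reaction, the direct and next-reaction methods coincide):

```
import math
import random
import pylab as plt

k = 0.5       # first-order rate constant (arbitrary)
A = 100       # initial number of A molecules (arbitrary)
t = 0.0
tEnd = 25.0

# Lists, because we don't know beforehand how many points we'll get
times = [t]
amounts = [A]

while t < tEnd and A > 0:
    a = k*A                                  # propensity for A -> B
    # Exponential waiting time; 1 - random() avoids log(0)
    t += -math.log(1.0 - random.random())/a
    A -= 1                                   # fire the reaction
    times.append(t)
    amounts.append(A)

plt.step(times, amounts, where='post')
plt.xlabel('Time')
plt.ylabel('Amount of A')
```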

I'll also make a shameless plug for my book Enzyme Kinetics for Systems Biology, which explains the algorithm.

## New Textbook Published

I am pleased to announce the publication of my new textbook:

## Introduction to Metabolic Control Analysis

“This book is an introduction to control in biochemical pathways. It introduces students to some of the most important concepts in modern metabolic control. It covers the basics of metabolic control analysis that helps us think about how biochemical networks operate. The book should be suitable for undergraduates in their early to mid years at college.”

Available at Amazon for $49.95 or directly from me for only $29.95 at books.analogmachine.org

### Content:

1. Traditional Concepts in Metabolic Regulation
2. Elasticities
3. Introduction to Biochemical Control
4. Linking the Parts to the Whole
5. Experimental Methods
6. Linear Pathways
7. Branched and Cyclic Systems
8. Negative Feedback
9. Stability
10. Stability of Negative Feedback Systems
11. Moiety Conserved Cycles
12. Moiety Conserved Cycles

Appendix A: List of Symbols and Abbreviations
Appendix B: Control Equations

## Repeatable, Reproducible [and Replicable]

There appears to be great confusion in the scientific and social sciences communities on the meaning of words related to certain aspects of the scientific method. The ArXiv paper by Lorena Barba “Terminologies for Reproducible Research” highlights the confused state that has appeared over the last 20 years. The words in question include:

Repeatability
Reproducibility
Replicability

I will dispense with replication simply because it's too hard to say quickly (especially replicability), but see below for a more serious reason. The contention appears in the meaning of reproducibility, or to reproduce. As Barba points out, there are at least three ‘camps’ in this community, which she labels A, B1, and B2. The A camp makes no distinction, so we'll forget about those. To describe B1 and B2 we must look at two extreme scenarios with respect to an experiment:

1. An experiment is carried out and is done again by the same author, using the same equipment, same methods, basically the same everything.

2. The experiment is carried out by a third party using different equipment, different methods, etc. Basically, everything is different.

In between these two extremes are variants. For example, the third party could use the same methods but implement them independently of the original author, usually by reading the description given in the original published paper.

Given these descriptions, the B1 group calls the first scenario ‘to reproduce the experiment’, while the second group, B2, calls the first scenario ‘to replicate the experiment’, and therein lies the contention.

Personally, I don't like either of these terms as used here. As I mentioned before, replicability is a hard word to say. But not only that: from a dictionary perspective, it means the same thing as reproducibility. The Oxford English Dictionary describes replicability as “The quality of being able to be exactly copied or reproduced.” So why use two words for two quite different things, when the two words have essentially the same meaning?

My personal choice is the following two words. Rather than use the word replicability, which seems redundant, I choose repeatability, hence:

Repeatability means ‘to repeat the experiment again’; the word implies that the experiment was done exactly as before (Scenario 1).

Reproducibility means ‘to recreate the experiment anew’; reproduce implies creating a new thing, independently of the old (Scenario 2).

Of course one can get much more fine-grained, especially when it comes to computational experiments. But the fine graining can be included as levels within the class reproducibility.

Other than a change in wording from replicability to repeatability, I appear to belong to camp B2. I should list others in camp B2: these include FASEB, NIST, Six Sigma, the ACM, Wikipedia (and this Wikipedia page), and The Physiome Project. I am sure there are others. For example, the ACM writes:

Repeatability (Same team, same experimental setup)

The measurement can be obtained with stated precision by the same team using the same measurement procedure, the same measuring system, under the same operating conditions, in the same location on multiple trials. For computational experiments, this means that a researcher can reliably repeat her own computation.

Reproducibility (Different team, different experimental setup)

The measurement can be obtained with stated precision by a different team, a different measuring system, in a different location on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artifacts which they develop completely independently.

Essentially the same definitions I gave above.

The National Academies recently studied this issue closely and is soon coming out with a report. It is apparently in favor of the more confusing option.

## How to do a simple parameter scan using Tellurium

A common task in modeling is to see how a parameter influences a model’s dynamics. For example, consider a simple two reaction pathway:

-> S1 ->

where the first reaction has a fixed input rate vo and the second reaction a first-order rate law k1*S1. The task is to investigate how the time course of S1 is influenced by vo.

The script below defines the model, then changes vo in increments and plots the effect on the pathway via a time course simulation. To do the parameter scan we exploit plotArray. We initially prevent the plots from being shown using show=False, make sure that each plot gets a different color using resetColorCycle=False, and finally show the plot using show(). To make things more interesting, we also add a legend entry for each plot.

Note we call reset each time we run a simulation to ensure that S1 is reset back to its initial condition.

The following figure shows the resulting plot. Thanks to Kiri Choi for pointing out how to use plotArray in this way.

## How to plot a grid of phase plots using Tellurium

Let's say we have a chemical network model where the species oscillate and you'd like to plot every pairwise combination of them on a grid. If so, this code might be of help.

This code will generate the figure below. Here is an alternative that removes the internal ticks and labels and arranges the column and row labels contiguously, which is probably closer to what one might expect. This version plots all combinations, including the transpose combinations. The following figure shows the steady-state oscillations; this was done by simulating the model twice and plotting the second set of simulation results.