December 2022: rust(y) Economics

My sort of person likes to make fun of economists. Maybe it's the combination of a socialist's disdain for the capitalist model paired with a physics major's ability to do better maths and study an actual science. However, that doesn't mean economics is totally useless, or doesn't make for a fun toy. Maths for economists is in essence the study of Brownian motion on an inclined surface, and the statistics, for example, make for a decent model. A friend approached me about making some minor contributions to a larger project, the details of which I'll omit here; the project is coded in rust and deals in heavy statistics. I haven't dealt with hard statistical maths since that one lecture about a year ago, so I'm expecting to need a refresher, and I've never touched rust in my life. I'll have to amend both before being able to do anything substantial. As I have about a week at the end of the month, I'm expecting to be able to put a little more time into the maths bit. I'll approach that part similarly to the Algebra and Theoretical Physics stuff, and add some code for distributions or handy computations, in rust.

So December is one of those months you learn not to pay too much attention to if you're an on/off working musician. I somehow did not think of that, since I haven't been an active working musician for a couple of years, but this time gigs had been eating my weekends while I wasn't looking. I really got started around the 25th, so I was keeping a tight schedule. The upside is that I got going right after doing a completely fresh system installation on my main work machine. I had been running Backbox 7 up till late December, when I noticed that new installations were starting to be unhappy with my already installed packages like libc and glibc, which was quite worrying. My system had reached its breaking point between my liberal tinkering and just being outdated. Looking back, that might be why the last few software-related things I showcased even needed an Installation section. This time, with a fresh copy of Backbox 8, and without all the reversed lobotomies left in the system, the steps listed on the rust setup page worked as intended. I immediately noticed that rust looks a little like C++, which I have moderate experience with. So much for the rust part; now we have to do some maths.

We'll be doing statistics for economists (first), so when in Rome... I've been recommended a book called An Introduction to Quantitative Finance by Stephen Blyth. I don't have the time to go too deep into the material, so I'll just have to do the good old pre-exam skimming, find some nice descriptions of distributions and write some corresponding functions, d'accord? D'accord.

The book looks reasonably paced, and I thought I might just take the most important examples and turn them into functions. In my attempt to do that, I noticed something that broke my brain for a solid ten minutes.

This "proof" makes my nose bleed. This sort of reasoning seems wrong and unwise to use as a proof for anything, when the only definition of a zero coupon bond is that it evaluates to 1 at its maturity. There's a bunch of assumptions made here that are left unsaid, most of which are very much non-trivial and should by all rights be stated somewhere in the near vicinity of the proof. I think this might be how mathematicians feel when a physicist neglects to mention that all calculations happen in Hilbert space.

The easiest calculation in financial maths should still be familiar to everybody: some initial value experiences constant percentage growth over time. After an arbitrary time, what is the value? There are some extra parameters in my function, since I want it to be very customizable. That's why I pass a delta that, together with the start and end, tells the function how many increments it's supposed to calculate. Technically I could replace all three of them with a single parameter called steps and use that for the exponent, but this way works too.

fn interest_constant(
    value: f64,
    delta: f64,
    rate:  f64,
    start: f64,
    end:   f64
) -> f64 {
    // exp is a method on f64 in rust, not a free function
    value * (rate * (end - start) / delta).exp()
}

I think the way rust indicates the type of its return value is interesting; I for one haven't seen it done this way before. Otherwise it's quite comprehensible. The return value is denoted by the final line of calculation not ending in a semicolon, which would otherwise make it a statement and trigger some compiler issue, no doubt. I haven't tried yet though. I'm sure I'll make that mistake organically at some point.
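A quick sanity check of the function, restated here self-contained (note that exp is a method on f64 in rust's standard library, not a free function): 100 units growing continuously at 5% over 2 time units with a delta of 1 should come out at about 110.5.

```rust
fn interest_constant(
    value: f64,
    delta: f64,
    rate:  f64,
    start: f64,
    end:   f64
) -> f64 {
    // continuous compounding over (end - start) / delta increments
    value * (rate * (end - start) / delta).exp()
}

fn main() {
    // 100 at 5% for 2 time units, delta = 1
    let v = interest_constant(100.0, 1.0, 0.05, 0.0, 2.0);
    println!("{:.3}", v); // about 110.517
}
```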

I spent an hour or so trying to read in a file, but I remember now why I kept having trouble with C++. There's a bunch of structs liberally used as return values of basic functions, which makes it difficult for somebody with my approach to tinker around quickly. Instead I got further into the book and wrote some very basic statistical functions. I'm sure rust has functions to evaluate means or standard deviations of a vector, but writing the easy stuff yourself gets you used to how the language handles pointers and all that.
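For completeness, here's roughly what the file reading could have looked like with std::fs; the path and the one-value-per-line format are invented for this sketch, not taken from the project.

```rust
use std::fs;

// Read a hypothetical data file containing one f64 per line.
fn read_values(path: &str) -> Vec<f64> {
    fs::read_to_string(path)
        .expect("could not read file")
        .lines()
        .filter_map(|line| line.trim().parse::<f64>().ok())
        .collect()
}

fn main() {
    // write a small test file so the example is self-contained
    fs::write("/tmp/values_example.txt", "1.0\n2.5\n4.0\n").unwrap();
    let v = read_values("/tmp/values_example.txt");
    println!("{:?}", v); // [1.0, 2.5, 4.0]
}
```

Back to the statistics functions: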

fn compute_mean (
    in_vec: &Vec<f64>
) -> f64 {
    let mut mean = 0_f64;
    for i in in_vec {
        mean += i;
    }
    let divisor: f64 = in_vec.len() as f64;
    mean / divisor
}

fn standard_deviation (
    in_vec: &Vec<f64>,
) -> f64 {
    let mut sd = 0_f64;
    let mean = compute_mean(in_vec);
    for i in in_vec {
        sd += (i - mean).powf(2.0);
    }
    let divisor: f64 = in_vec.len() as f64;
    (sd / (divisor - 1.0)).sqrt()
}

For example, I'm very much used to C doing pointers using asterisks; rust seems to work more with the borrowing ampersand, with the asterisk only showing up for dereferencing. The usage of vectors is also not something I'm strictly used to, since even though I should ordinarily use them with numpy and such, when you're used to arrays it's often just faster to do it that way. Here I can't really avoid them, since vector lengths are mutable and array lengths aren't, which disqualifies arrays for most of my use cases. That's fine, even though having vectors restricted to one datatype might come back to bite me when I come up with one of my patented terrible, but technically not wrong, solutions.
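As an aside, rust's iterator adapters can express the same statistics without manual loops or index handling - a sketch of my own, equivalent to the functions above:

```rust
// Mean via the Sum trait on iterators of &f64.
fn mean(v: &[f64]) -> f64 {
    v.iter().sum::<f64>() / v.len() as f64
}

// Sample standard deviation (Bessel's correction, n - 1).
fn std_dev(v: &[f64]) -> f64 {
    let m = mean(v);
    let ss: f64 = v.iter().map(|x| (x - m).powi(2)).sum();
    (ss / (v.len() as f64 - 1.0)).sqrt()
}

fn main() {
    let data = vec![2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0];
    println!("mean = {}", mean(&data));    // 5
    println!("sd   = {}", std_dev(&data)); // about 2.138
}
```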

What Stephen Blyth has taught me is that my preconceptions about economists are mostly correct, but also that economists probably have great reading comprehension. It's also arguably a good book to get an overview of economic jargon and the systems involved in transactions, not so much the analytical maths if you're not planning to model an economy. Instead, I've decided to shelve it as some light reading for the future (there are exercises I might do, if I'm not too short on time). Probability theory might be more what I'm looking for, and also what I had originally been looking for. However, it's now almost the 28th and I probably don't get the last two days of December. What this leaves me with is maybe one or two intense sessions of coding, learning as much as I can while keeping the Blyth on the side as a nice little handbook that teaches me five to ten economic ideas a day - each of very variable veracity.

What will definitely come in useful is a method to approximate a curve with a polynomial function. The method to do that is spline interpolation. As opposed to, say, a sine/cosine interpolation, splines have the advantage of choosing the approximation of the smallest possible order while keeping the curve smooth. You want that in economics, because curves that are too jittery don't make for great predictors. Multiple Fourier-approximated signals do tend to come out vaguely box-shaped after all, which is bad news, since the function is then really only smooth in theory.

// assuming `use splines::{Interpolation, Key, Spline};` at the top,
// and that `vects` is a Vec<f64> of sampled data points
let mut keys: Vec<Key<f64, f64>> = Vec::new();
let mut idx = 0;
for i in vects {
    keys.push(Key::new(f64::from(idx), i, Interpolation::Linear));
    idx += 1;
}
let spline = Spline::from_vec(keys);
println!("Spline at t = 5.5 is {:?}", spline.sample(5.5));
let oob: f64 = idx as f64;
println!("Spline 0.8 seconds before end of record might be {:?}",
    spline.sample(oob - 0.8));

An interesting note: the rust compiler is really helpful for a newbie like myself at identifying possible solutions. The whole mutable/immutable and reference business gets resolved quite easily whenever I follow the suggestions. It also gives a really helpful error code that leads straight to the corresponding documentation.

Okay, last thing. Since there's a very easy definition of an approximated derivative, that's something I can play with (and good thing I did, because I discovered the wonder of the .unwrap() method). First, the function.

fn approx_derivative(
    func:       &splines::Spline<f64, f64>,
    point:      f64,
    delta_x:    f64
) -> f64 {
    // let a: f64 = func.sample(point + delta_x) as f64;
    // let b: f64 = func.sample(point) as f64;
    let a = func.sample(point + delta_x).unwrap();
    let b = func.sample(point).unwrap();
    (a - b) / delta_x
}

I'm relatively happy with how easy handling of references seems to be in rust. Maybe it's because I don't need to have it go both ways, so I can just turn off my brain. I left in two lines that don't work, commented out, to illustrate something. They were an attempt of mine to cast return values that should really already be f64 into f64, but return values often come out as something called Option<f64>, which holds either a value or nothing at all. I get how that structure could be interesting, but I can't help but think back to error enums in C. Anyway, as long as there is the .unwrap() method to fetch the value out of the Option<>, I'm happy enough. Usage of the function looks a little bit like this:

let x = approx_derivative(&spline, 4.0, 0.01);
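As a side sketch of my own (not part of the project), the same forward-difference idea works on any plain closure, without the splines crate; the derivative of x² at x = 3 should come out near 6.

```rust
// Forward-difference derivative of an arbitrary closure,
// same idea as the spline version, minus the crate.
fn approx_derivative_fn<F: Fn(f64) -> f64>(f: F, point: f64, delta_x: f64) -> f64 {
    (f(point + delta_x) - f(point)) / delta_x
}

fn main() {
    // d/dx of x^2 at x = 3 is exactly 6; the forward
    // difference overshoots by delta_x.
    let d = approx_derivative_fn(|x| x * x, 3.0, 1e-6);
    println!("{}", d); // close to 6
}
```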

And it works. Well, all in all this was a positive, if ever-so-slightly rushed, experience, I feel. I might even prefer rust to C, certainly to C++, and it seems to run very quickly (assuming my expectations haven't been dampened too much by my old system). The whole cargo CLI is also a huge bonus, and I prefer it to writing and linking my own Makefiles or CMake files, which is something I never managed to anchor well enough in my brain not to forget after a week of not thinking about it. Maybe this will change if the projects become more complex, but so far I don't see anything inhibiting the decision to start a new project in rust, purely on the basis of convenience.
