Book Review: Thinking in Systems

October 14, 2015


Thinking in Systems by Meadows & Wright was recommended by Patrick O’Shaughnessy on his blog. I also think his father’s book, “What Works on Wall Street”, is an excellent quantitative reference for longer-term investment models.

Systems thinking is a thought exercise in deciphering the variables in any given system, how they are controlled, and what happens when inputs are changed. The book had me thinking about systems in all different facets of life, from markets to organizations to politics. Some highlights:

“Look beyond players to the rules of the Game”

The purpose of a system is deduced from its long-term behavior, not from its rhetoric, and is the least obvious part of the system.

Instead of asking who’s to blame, ask what’s the system?

News is event analysis. It has little to no predictive/explanatory value but is most visible. #Twitter #CNBC

Competitive Exclusion Principle – the rich get richer. Solutions are to diversify (new markets) or level the playing field (inheritance tax).

Seeking the wrong goals can produce effort but not results. The example given is a focus on GNP/GDP versus happiness or equality.

Changing people won’t change the system. A new president can change the inputs, but the system and its outputs remain. #PresidentialElection

Leverage points, things that can alter systems, are counter-intuitive.

The book has examples for each of these highlights; I wish it had more. This seems like a fascinating subject area and applies very well to finance. Any readers with more suggestions on systems analysis, please comment!


Random Notes on Python II

April 27, 2015

Continuing an old post on Python, I’ve been playing around with an awesome library built by P. Morissette simply titled bt. It includes numerous functions for back testing and for displaying results and charts for daily and lower-frequency strategies.

Here’s an example of a simple momentum based tactical asset rotation strategy:


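The code screenshot didn’t survive, but the idea can be sketched without bt at all. Here is a rough, self-contained version of a momentum rotation in plain pandas, with synthetic prices and placeholder tickers (bt’s actual API wraps this kind of logic in Strategy/algo objects, so take this as an illustration of the concept rather than the original code):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
dates = pd.bdate_range("2014-01-02", periods=300)

# Synthetic prices for three placeholder tickers (not real data)
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, (300, 3)), axis=0)),
    index=dates, columns=["SPY", "TLT", "GLD"])

rets = prices.pct_change()
momentum = prices.pct_change(60).dropna()      # 60-day trailing return

# Rotate into yesterday's top-momentum asset (shift avoids lookahead)
pick = momentum.idxmax(axis=1).shift(1).dropna()
weights = pd.get_dummies(pick).astype(float)
strat_ret = (rets.reindex(weights.index) * weights).sum(axis=1)
equity = (1 + strat_ret).cumprod()
print(round(equity.iloc[-1], 3))
```

The 60-day lookback and the three tickers are arbitrary choices for the sketch; the real work in bt is done by its strategy and rebalancing machinery.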
It has a function to weight stocks based on mean-variance optimization:


Spits out results for multiple strategies at once:


Even has a nicely formatted table for results:


Little correlation charting:


Technology is an amazing thing. I highly recommend you check out the library if you have interest:

Somewhat related, I’ve also been playing around with some machine learning libraries, namely scikit-learn. However, I’m not quite certain of its advantage in forecasting financial data. I remember reading up on machine learning a few years back (simple examples, granted) but came away with the conclusion that it’s not much more accurate than simple technical analysis tools. Please comment if you have more experience utilizing machine learning to trade stocks!
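To illustrate the kind of sanity check I mean, assuming scikit-learn is available: fit a plain logistic regression on lagged returns of a purely random series. Since there is nothing to learn, out-of-sample accuracy should hover around a coin flip (everything here is synthetic and hypothetical):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
rets = rng.normal(0, 0.01, 1000)               # synthetic "daily returns"

# Features: five lagged returns; target: sign of the next day's return
X = np.column_stack([rets[i:i + 995] for i in range(5)])
y = (rets[5:] > 0).astype(int)

X_train, X_test = X[:700], X[700:]
y_train, y_test = y[:700], y[700:]

model = LogisticRegression().fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(round(acc, 3))                           # near 0.5: nothing to learn
```

With real price data the interesting question is whether the accuracy is distinguishable from this baseline after costs.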

And here’s a little portfolio I put together based on bond return momentum and minimum variance weighting:


A 1.89 Sharpe and a quarter of TLT’s max drawdown. Add a little leverage to get desired results 🙂

Spoofers be spoofing since spoofing been spoofing

April 24, 2015


There is good cause for skepticism about the arrest of a UK trader, in his parents’ basement, for causing the flash crash, as written here by Michael Lewis:

I started out in a very active trading room, many of the traders ex-market makers, and that’s where I first saw something like this in action. Spoofing has been around a very long time. In fact, one might say that Jesse Livermore used this type of strategy in “Reminiscences of a Stock Operator” when he was executing large orders. I haven’t read it yet, but I was told to read “Dark Genius of Wall Street” about Jay Gould, an old-time robber baron who also used a form of spoofing.

More interestingly, I remembered a story of a guy known as “The Flipper” who allegedly made something like $50-$60MM spoofing German Bund futures (he only claims to have made more than $5MM per year) and did about 180,000 cars/day (~$70B notional), according to an interview in Trader Monthly circa 2005.

His strategy was described like this in an old forum post back in 2005:

He places massive size to run the market like 6000 bids and offers in the bund. Sell 100 into his 6000 bid and all of a sudden it’s 6000 offered. Buy another 100 into that offer and hey presto back bid 6000. Occasionally he gets rumbled like last week when the hedge funds came into the bund on the back of GM and ford and took out 7 prices of these ridiculous orders in one trade. All of a sudden the orders were down to 2/300 lots in the whole book. Whoever got taken got a shock. This type of trading only works in quiet markets when paper isn’t about. The flipper is no longer much of an issue as plenty of people match him in size and clip him in one go but a few months back he had the market in the palm of his hand.

Here is the Trader Monthly article:


Digging a bit deeper I found this quote from him:

Now you have the FBI investigating algorithmic trading because what the algorithmic traders do is market manipulation, but somehow it is legal. If I put in orders and then cancelled them, I could be accused of market manipulation. But they do it in micro-seconds at the rate of 100,000’s of times a day, putting in and cancelling orders, and no one really stops them.

So for these reasons I am now doing more mid- to long-term trades in a variety of markets and I don’t just concentrate on my core markets (bonds) which I was doing until I stopped short-term trading.

Fun with Leverage

March 11, 2015

An old post regarding a simple asset allocation model that beat the average performance of the top ten hedge funds of the past 15 years got me thinking. As an aside, that average performance will be biased heavily upwards, since we only know in hindsight which funds made the top 10. Thinking can be dangerous. So can leverage. In the previous Python post, I tested a simple RV timing tool in VXX; combining it with the TLT ETF as a proxy for bonds, you can get something like this:

VXX Strategy Combined with Leveraged TLT ETF


I randomly selected data from Jan 2010 to Jan 2011 to optimize the weightings. I constrained the VXX strategy to 1x but tested as high as 3x on TLT. The 1.9 Sharpe compares with the 1.6 Sharpe of a 1x allocation to the VXX strategy alone. The last couple of months have been pretty rough on VXX strategies, so here is a closer look from 2014:

Jan 2014 - March 10, 2015


BTW, all calculations include commissions!
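As a sketch of the mechanics (not the actual backtest data), here is how weighting two return streams and levering one of them plays out, with synthetic returns standing in for the VXX strategy and TLT. Note that levering a single sleeve changes the combined Sharpe through the correlation structure, while levering the whole portfolio leaves the Sharpe unchanged:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1250                                       # roughly five years of days

# Synthetic daily returns standing in for the VXX strategy and TLT
vxx_strat = rng.normal(0.0008, 0.012, n)
tlt = 0.2 * vxx_strat + rng.normal(0.0003, 0.008, n)

def sharpe(r):
    return r.mean() / r.std() * np.sqrt(252)   # annualized Sharpe ratio

print(round(sharpe(vxx_strat), 2))             # standalone strategy
for lev in (1, 2, 3):                          # leverage on the TLT sleeve
    combo = 0.5 * vxx_strat + 0.5 * lev * tlt
    print(lev, round(sharpe(combo), 2))
```

The 50/50 weights and the correlation between the streams are made up for the illustration; the blog’s actual weights came out of the 2010–2011 optimization described above.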

Enhanced Covered Call Writing – AQR Paper

March 4, 2015

Covered calls are an extremely popular strategy utilized by large institutions and retirees alike. As the chase for yield continues, the strategy will likely remain popular, since it is often viewed as an equity income or enhanced income trade. Take a look at S&P index skew, or the call skew on most S&P underlyings, to see the downward pressure on call pricing versus puts due to these flows.

AQR put out an interesting paper in May 2014 which I will attempt to summarize:

1) Covered call writing performance actually comes from three sources: the equity risk premium (stock market), the volatility premium (IV), and a new premium they call the equity reversal premium. While the first two should be self-explanatory, and any positive method of forecasting either premium should enhance returns, the third one is quite interesting and unique.

The equity reversal premium can be thought of as a market timing strategy. The equity exposure (the exposure to the equity risk premium) is a function of the delta of the short call: the covered call’s net delta is one minus the call’s delta. An ITM covered call therefore has little to no exposure to the equity risk premium, while an OTM covered call has almost 100% exposure. This should make sense.

Source: AQR

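To make the moneyness point concrete, here is a quick Black-Scholes sketch (my own illustration, not from the paper): since the covered call’s net equity exposure is one minus the short call’s delta, a deep ITM call leaves almost no exposure and a far OTM call leaves almost full exposure.

```python
from math import erf, log, sqrt

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def call_delta(spot, strike, vol, t, r=0.0):
    """Black-Scholes delta of a European call."""
    d1 = (log(spot / strike) + (r + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    return norm_cdf(d1)

spot, vol, t = 100, 0.20, 30 / 365             # hypothetical 30-day option
for strike in (80, 100, 120):                  # deep ITM, ATM, far OTM
    exposure = 1 - call_delta(spot, strike, vol, t)
    print(strike, round(exposure, 2))          # net equity exposure
```

The spot, vol, and tenor are made-up numbers purely to show the shape of the exposure curve.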

2) So the variation in equity exposure is similar to a market timing strategy where you buy more index as it moves down and sell index as it moves up. If the market is choppy, this should do well; in a trending market… not so much.

3) The paper suggests hedging out the equity reversal premium and maintaining exposure to just the equity risk and volatility risk premiums. They suggest delta hedging the option in addition to holding the equity exposure, and the results look impressive:

Source: AQR


The next step would be back-testing this strategy. Finding delta data might be difficult, as would adding in the commission costs of the extra trading! I also do not know the hedging frequency or the duration of the call sold initially, but I might be able to bootstrap something!

First Loss Capital Model

February 5, 2015

I found it interesting how little I knew about the structure of ‘prop firms’ from an investor/partner perspective. After talking to some colleagues I’ve pieced together an idea of how the model works, which I will explain below. Full disclosure: I could have misunderstood the explanation!

The ‘first-loss’ model is very much like a brokerage account but with more leverage. It’s like FXCM for equities. Your primary concerns are risk management (you don’t want to get FXCM’d or Madoff’d), the amount of leverage available, and trading costs. Once your equity is drawn down, your account gets margin called and likely liquidated. So if you deposit $50k and get $500k in capital but lose $50k of P&L, your account is shut down by the fund. Keep in mind the difference between a 100% loss on your deposit and a 10% loss on the allocated capital. But what if you are the owner of the firm?

Typically someone will deposit, say, $10MM, either their own money or raised from investors, and become the GP. They will then seek out additional investors who deposit, say, $100MM into the same account. The second group are the LPs. The LP cash is NOT TO BE RISKED. Any losses are absorbed by the GP FIRST, and there is likely an understanding that should the GP equity be completely depleted, the positions will be liquidated and the initial $100MM will not be touched. In return, the LPs receive a guaranteed return backed by the revenue (P&L, commissions, etc.) before the GPs are paid out.

Here’s a quick example: Owner X raises $10MM as GP and Investor Y invests $100MM. A prime brokerage account is opened and $110MM of equity is deposited. The prime broker provides 5x on it, so through the magic of leverage you now have a $550MM hedge fund! Hire a bunch of guys and dole out the $550MM among them. Let’s say you make a 10% return from the P&L cut plus commission/financing charges, or $55MM. The LPs are guaranteed 10% before the GPs are paid, so they receive $10MM and the other $45MM goes to the GPs. Not too shabby.

But let’s say after 3 months the firm suffers a $9MM loss. While the prime broker may still show $101MM of equity and $505MM of capital available, the firm only has $1MM of its own equity to play with. They will either decide to fold or severely cut positions, as they can only absorb $1MM more in losses. On a $505MM book, a $1MM loss is a few ticks!
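The arithmetic from the two scenarios above, spelled out (all figures in $MM):

```python
# All figures in $MM, matching the example above
gp, lp = 10, 100
leverage = 5
buying_power = (gp + lp) * leverage            # 550 of capital to deploy

# Good year: 10% return on the levered book
revenue = 0.10 * buying_power
lp_guarantee = 0.10 * lp                       # LPs are paid 10% first
gp_profit = revenue - lp_guarantee             # remainder goes to the GPs

# Bad quarter: a 9 loss is absorbed by GP equity first
gp_cushion = gp - 9                            # first-loss cushion left
print(buying_power, revenue, gp_profit, gp_cushion)
```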

So in essence, a prop firm that requires no capital is simply allocating some of the GP money to your account and keeping a larger percentage of your profits. There really isn’t much difference between prop and ‘first loss’ other than where the initial deposit comes from structurally; obviously the payouts are quite different. It is interesting to see Leucadia getting into the game, and from an LP standpoint it’s actually quite an interesting investment. The principal is mostly secure with the potential for a 10% dividend, so it’s similar to issuing a bond, and there is nothing to say the LP can’t invest some money in the GP as well to retain upside! From the article, Leucadia’s investment is predicated on the ability to raise $400MM of outside capital. I would imagine that $400MM will be ‘first loss’ capital, likely HF guys who’ve managed to raise $10MM or $20MM but not enough to be self-sustaining. Leucadia can offer space, back office, compliance, marketing, etc., and turn a $10MM fund into $50MM or $100MM overnight.

Incidentally, that is probably why a majority of these firms can’t or won’t pay salaries: they simply do not have the cash! There are pros and cons to the model, mostly revolving around the issue of leverage, and it’s probably situation dependent. I just wondered where the large amount of capital came from; initially I thought it was simply from borrowing (and in a way it is), but it’s more like a firm issuing a bond. Food for thought!

Random Notes with Python

January 13, 2015

I admit I had a lot of trouble just getting up and running in Python. It probably should have been easier, but programming languages seem to have gotten a bit more complex since Fortran and C++, when all you needed was an editor and a compiler. So I’m sharing these notes to hopefully help anyone else out there who has an interest in using Python specifically for back testing trading strategies. Python can be used for modelling and forecasting and a whole host of other things, but for my project I basically needed to download data, arrange it, run some calculations, and produce an equity curve. I wanted to test a simple VXX strategy based on a realized/implied volatility divergence I read about online somewhere.

1) Download & install Anaconda 64-bit. This is what’s known as a ‘distribution’, for future reference. Anaconda comes pre-installed with most of the ‘modules’ one needs to turn it into a back testing engine: things like NumPy, Matplotlib, SciPy, and Pandas are already included and are necessary. It also ships with IPython, which I found to be an excellent tool since it splits bits of code into cells. There is no need to re-run the entire script, which saves time and helps with error checking!


Figure 1

2) To run IPython, find that little search button by moving the mouse all the way to the right. Yeah, I know, most of you know this, but I admit it took me awhile in Windows 8. Search for IPython and run IPython Notebook. This will open a command prompt and in turn open IPython Notebook in your browser.

3) The next step is to get data from the Internet. There are actually a lot of ways of doing this. If you want to use Pandas (which I believe is the ‘standard’ method) here is a tutorial: I utilized another module called tradingwithPython from this site: There you can download an executable that installs automatically. Figure 1 shows the code to download data from Yahoo. I think if you spend 5 minutes and have a little bit of programming background, the relevant lines of code should become apparent, and then you can adapt them to your needs. The DataFrame is pretty awesome as it automatically aligns the data by date! Otherwise the starting dates for the respective securities would be off and your back test wouldn’t make much sense!
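The date alignment point is worth a tiny illustration (made-up prices and dates):

```python
import pandas as pd

# Two series with different start dates, as you'd get from Yahoo
spy = pd.Series([200.0, 201.5, 199.8],
                index=pd.to_datetime(["2015-01-05", "2015-01-06", "2015-01-07"]))
vxx = pd.Series([31.2, 30.9],
                index=pd.to_datetime(["2015-01-06", "2015-01-07"]))

# The DataFrame aligns both on the date index automatically; the
# missing VXX history shows up as NaN instead of shifting the rows
df = pd.DataFrame({"SPY": spy, "VXX": vxx})
print(df)
```

Without this alignment, the two price columns would pair up by row position rather than by date, which silently corrupts a backtest.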

4) Figure 2 below is the guts of the strategy. It’s a bit more complicated to figure out, but feel free to ask questions and I’ll do my best. This is really where some of the ‘learning’ occurs, and going back and forth between IPython and Google search really helps. Running code in individual cells really shines here, as you don’t have to re-run previous sections over again, which may be very time consuming.

Figure 2

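Figure 2 is an image, so as a rough sketch of the kind of realized/implied divergence rule involved (all data synthetic, the 2-point threshold hypothetical, and not the actual Figure 2 code): short VXX only when implied vol trades rich to recent realized vol.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 500
dates = pd.bdate_range("2013-01-02", periods=n)

# Synthetic stand-ins for SPY returns, VIX levels, and VXX returns
spy_ret = pd.Series(rng.normal(0.0004, 0.010, n), index=dates)
vix = pd.Series(16 + 0.05 * rng.normal(0, 2, n).cumsum(),
                index=dates).clip(10, 40)
vxx_ret = pd.Series(rng.normal(-0.001, 0.03, n), index=dates)

# 10-day realized vol, annualized and scaled to VIX points
realized = spy_ret.rolling(10).std() * np.sqrt(252) * 100

# Short VXX when implied is rich to realized (2-pt threshold is
# hypothetical); shift(1) trades on yesterday's signal
signal = ((vix - realized) > 2.0).astype(float).shift(1).fillna(0.0)
strat = -signal * vxx_ret
equity = (1 + strat).cumprod()
print(round(equity.iloc[-1], 3))
```

On random data the result is meaningless; the point is only the shape of the signal and backtest loop.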

5) Downloading external modules was another relatively difficult task for me that should have taken 5 minutes lol. I downloaded the DX Analytics module: It has some really cool mean-variance optimization functions and some derivative modelling functions I haven’t really gotten into yet. The trick is putting the unzipped folder in the right place! I finally found it under C:\users\…\Anaconda\Lib\site-packages and just copied the whole folder in there. IPython was able to find it when I imported any of the functions. Simple, but it took me awhile lol.

These are just some notes which I will hopefully edit and add to in the future. This isn’t meant to be a complete tutorial, but if you’re looking for a complete example, I learned a lot from going through this example strategy: Hopefully it helps you too!

Edit (1/17/15)

I came across a great little function to calculate the peak-to-trough drawdowns of an equity curve or stock. The output is the maximum drawdown in points, dollars, and percentage terms, as well as its duration. I can use this function to optimize variables using minimal drawdown as a constraint!

SPY Maximum DD was 74 points and lasted 1656 days from 1994 to 2015.

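I won’t reproduce that function here, but a minimal version of the same idea looks like this (my own sketch; it reports the peak-to-trough duration, whereas the 1656 days quoted above is presumably peak-to-recovery):

```python
import pandas as pd

def max_drawdown(equity):
    """Peak-to-trough drawdown of an equity curve: points, percent, days."""
    peak = equity.cummax()                     # running high-water mark
    dd = equity - peak                         # drawdown in points (<= 0)
    trough = dd.idxmin()
    prior_peak = equity.loc[:trough].idxmax()
    return {
        "points": -dd.min(),
        "percent": -(dd / peak).min() * 100,
        "duration_days": (trough - prior_peak).days,
    }

curve = pd.Series([100, 110, 105, 90, 95, 120],
                  index=pd.date_range("2015-01-01", periods=6))
print(max_drawdown(curve))
```

For an optimizer, you would call this on each candidate equity curve and constrain or penalize the "percent" figure.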