Since everyone kept asking, I'll discuss an actual strategy this time. This is one of the first strategies I began working on when automating my manual trading. One of the principles I still maintain is doing manual analysis prior to creating a strategy. I pick equities based on the strength of the business and my belief in their future success. With that said, here is the disclaimer: don't run this on a business with a decent chance of bankruptcy or failure. Dividend stocks are a good choice, as are businesses that can't be allowed to fail, e.g., defense. Basically, don't choose to run something like this on Hertz.
Necessary packages are numpy, scipy, pandas, and matplotlib. I'm going to look at RTX minute data ranging from 2020-04-03 to 2020-12-18. Minute data is required for all of the following analysis. The necessary columns in the dataframe are Close, Volume, and TradingDay. In the following code, Close == Last.
Let's get started by loading and plotting our data.
```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

def load_data(ticker):
    fname = 'Disk/Your/Filepath/Here/{}.csv'.format(ticker.upper())
    data = pd.read_csv(fname, index_col='Time')  # index looks like: YYYY-MM-DD HH:MM
    days = pd.unique(data.TradingDay.values)     # TradingDay looks like YYYY-MM-DD
    return data, days

data, days = load_data('rtx')
plt.plot(data.Last.values)
plt.axvline(15300)  # picked by looking at the chart; corresponds to 11AM 2020-06-01
plt.show()
```
[RTX Minute Data](https://preview.redd.it/0sfcwliygla61.png?width=960&format=png&auto=webp&s=a100739beecc8055f7a07ee44b1c1ea96886ac92)
At time = 0 we were coming off of the March lows, and I was watching to see where the then-new RTX would end up following the merger (4/3). I liked the company, and after the low and following peak, I was interested in beginning to trade/accumulate shares. Given the nature of air travel and the pandemic, I assumed that it would trade sideways for some time, so the next natural step was to find an entry point and exit point. Looking at the chart, it is difficult to take away any real information that can be of help.
So, let's create a new chart that weights price by volume and consolidates it on a daily or bidaily time frame. I chose to use a 2D histogram. Note that the vertical red line corresponds to day 39 in our "days" array. Here is the code:
```python
def hist_params(data, days):
    p, inds, v = [], [], []
    for i in range(len(days)):
        tmp = data.loc[data.TradingDay == days[i]]
        c = tmp.Last.values
        vv = tmp.Volume.values
        ind = np.full(len(c), i, dtype=int)  # day index for every minute bar
        p.append(c)
        inds.append(ind)
        v.append(vv)
    prices = np.round(np.concatenate(p))
    inds = np.concatenate(inds)
    v = np.concatenate(v)
    return inds, prices, v

def plot_h(x, y, w, lookback=1):
    y_bins = int(y.max() - y.min())        # one bin per dollar
    x_bins = (len(x) // 390) // lookback   # 390 minute bars per trading day
    plt.hist2d(x, y, bins=(x_bins, y_bins), density=True, weights=w, cmap='jet')
    plt.colorbar()
    plt.show()

x, y, wts = hist_params(data, days[:39])  # pretending we can't see the whole chart
plot_h(x, y, wts, lookback=2)             # lookback combines bins on the x axis (time)
```
[2D Histogram of the first 38 Days](https://preview.redd.it/7u6dqju0kla61.png?width=960&format=png&auto=webp&s=b3e6c5bdbb6f0041b0a4cf9583c832974f8372aa)
Lowest volume is dark blue to highest volume at dark red.
This is a good bit easier to read than the minute data above. At the high end, volume drops off at around 65 dollars a share; on the lower end, we see strong buying at ~58 dollars a share, with very strong volume at the post-crash low of 52. Volume is decent around the 62.00 level, which is what I would call our current value based on price action.
To recap:
1. We believe RTX has strong fundamentals and the price is expected to increase over time.
2. We have identified a time when we believe it will be trading in a ranging pattern due to current circumstances and uncertainty.
3. We have assigned a value of 62.00 to the equity for the time being.
4. We have assigned a low value of 58 and a floor of 52.
5. We have assigned a high value of 65.
Now, we need to create a strategy. We don't currently own any shares, so let's look at buying first. If the spot price is at 62, we aren't getting a bargain, but it is a fair price. As the value drops lower, we would want to buy more all the way to our floor of 52. One strategy would be to set limit orders at each dollar value, with the number of shares increasing with lower price. The limit orders (depending on available capital and other strategies) would reset as the price goes back up.
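The buy side described above can be sketched as a simple limit-order ladder. The dollar levels come from the histogram analysis; the linear sizing rule is a placeholder I'm assuming for illustration, not a recommendation:

```python
import numpy as np

def buy_ladder(fair=62.0, floor=52.0, base_shares=10):
    """Limit buy orders at each whole-dollar level from fair value down to
    the floor, sizing up as price falls (hypothetical linear sizing rule)."""
    levels = np.arange(fair, floor - 1, -1.0)            # 62, 61, ..., 52
    sizes = base_shares * (1 + (fair - levels).astype(int))  # bigger orders lower down
    return list(zip(levels, sizes))

orders = buy_ladder()
```

A real implementation would also track fills and re-place the orders as the price recovers, per the reset behavior mentioned above.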
Once we have a buying strategy, we would choose a strategy for selling shares. Let's say I want to accumulate shares but also take some profit to lower overall variance and decrease my overall level of risk. As the price rebounds, we do the same thing in reverse. Offload some shares at 62 all the way to 65 keeping some quantity of shares. Or sell covered calls at 65 as the price reaches 62, etc. I prefer calls assuming sufficient capital to buy enough shares.
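The scale-out between 62 and 65 can be sketched the same way. The `keep_frac` parameter (how much of the position to hold as a core) is my own hypothetical knob, not from the original strategy:

```python
import numpy as np

def sell_ladder(holding=300, start=62.0, top=65.0, keep_frac=0.5):
    """Limit sell orders spread evenly across whole-dollar levels from fair
    value up to the high level, keeping a core position (hypothetical sizing)."""
    to_sell = int(holding * (1 - keep_frac))   # only offload the non-core portion
    levels = np.arange(start, top + 1)         # 62, 63, 64, 65
    per_level = to_sell // len(levels)
    return [(float(p), per_level) for p in levels]

sells = sell_ladder()
```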
That's really the gist of it. This isn't a high-frequency strategy, but it leverages the advantages that retail traders have: being able to buy and sell quickly without moving the price, as well as good old-fashioned patience. Choosing a stock with a dividend offers a bonus if you are stuck holding, but complicates the covered call slightly.
Bonus for reading this far (if you didn't, look away): An introduction to KDE
I've always liked the idea of getting as much information as possible into a simple to view chart. One thing we can do with minute data is combine it to create daily data that looks at where the majority of the price action took place, rather than where the price ended up at the end of the day.
KDE stands for kernel density estimation. Normal (Gaussian) distributions are commonly used as the kernel. Essentially, the method places a normal distribution centered at each data point and sums the results, giving an approximation of the density. There is more to it, but that's the gist. Let's look at some code and the resulting plots:
```python
from scipy.stats import gaussian_kde
from scipy.signal import find_peaks
from arch import arch_model

def create_daily(data, days):
    daily, close = [], []
    for day in days:
        temp = data.loc[data.TradingDay == day]
        c = temp.Last.values
        v = temp.Volume.values.astype(float)  # cast so in-place division works
        v /= v.sum()                          # normalize volume to use as KDE weights
        xr = np.linspace(c.min(), c.max(), 500)
        kde = gaussian_kde(c, weights=v)
        kdx = kde(xr)
        pks = find_peaks(kdx)[0]
        if len(pks) > 1:
            # multiple density peaks: average them, weighted by peak height
            density = kdx[pks]
            prices = xr[pks]
            daily.append(np.average(prices, weights=density))
        else:
            daily.append(xr[pks[0]])
        close.append(c[-1])
    return np.array(daily), np.array(close)

def compare_daily(daily, close, vol=False):
    if not vol:
        fig, (ax1, ax2) = plt.subplots(2, sharex=True)
        ax1.plot(close, 'b', label='End of Day Close')
        ax1.plot(daily, 'r', alpha=.8, label='Daily Volume Weighted')
        ax1.set_title("Comparison of Daily Close to Daily Volume Weighted Price")
        ax1.set_ylabel("Price")
        ax1.legend(fontsize='xx-large')
        dd = close - daily
        ax2.plot(dd, 'b')
        ax2.axhline(0, color='r', linewidth=2)
        ax2.set_title("Difference between Daily Close and Volume Weighted")
        ax2.set_xlabel("Days")
        ax2.set_ylabel("Difference")
    else:
        def gmodel(log_returns):
            # scale returns up so the GARCH optimizer is well conditioned
            return arch_model(log_returns * 1000., vol='GARCH', p=1, q=1, dist='StudentsT')
        dlr, clr = np.diff(np.log(daily)), np.diff(np.log(close))
        dv = gmodel(dlr).fit(update_freq=5).conditional_volatility
        cv = gmodel(clr).fit(update_freq=5).conditional_volatility
        plt.plot(cv, 'b', label="Daily Close")
        plt.plot(dv, 'r', alpha=.8, label="Daily Volume Weighted")
        plt.title("Daily Volatility")
        plt.legend()
        plt.grid(True)
    plt.show()
```
[Comparing Generated Daily Data vs. Daily Last Close](https://preview.redd.it/ukjnvgo9qla61.png?width=960&format=png&auto=webp&s=d9aa10ca247669431828d37860118d1915f1f830)
And a comparison of the volatility between the two:
[GARCH \(1,1\) for Each Dataset](https://preview.redd.it/szik4zxcqla61.png?width=960&format=png&auto=webp&s=38eff958be89b9f0a75125cde7e8a48adf8d28b5)
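As an aside, the "sum of Gaussians" description of KDE above can be checked directly against scipy. This sketch uses synthetic data (not the RTX series) and reproduces `gaussian_kde` by hand, using scipy's own bandwidth factor:

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)
sample = rng.normal(62.0, 1.5, size=200)  # fake "price" data for illustration

kde = gaussian_kde(sample)
grid = np.linspace(sample.min(), sample.max(), 100)

# manual KDE: one Gaussian centered at each data point, averaged over points;
# h is scipy's effective bandwidth (data std scaled by Scott's factor)
h = float(np.sqrt(np.cov(sample)) * kde.factor)
manual = norm.pdf((grid[:, None] - sample[None, :]) / h).sum(axis=1) / (len(sample) * h)
```

`manual` and `kde(grid)` agree to floating-point precision, which is a nice sanity check that the intuition matches what the library actually computes.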
Hope you enjoyed the tutorial!