Generalized Moving Average Kernels
A moving average is a simple concept that traders often take for granted without considering its inner mechanics. In the most general sense, a moving average over the last n periods combines the past n values, each with its own weighting. The real power of a moving average lies in how those weightings are chosen; the full set of weightings is called a "kernel" or an "envelope".

A simple moving average weights every value equally, so the calculation considers past and current price action alike: a flat kernel. A weighted (linear) moving average (WMA) has a kernel shaped like a line, y = mx + b, decreasing as the distance from the current bar increases. The weighting is therefore higher for recent bars and lower for historic ones, and increasing the slope m makes the kernel more sensitive to recent bars. The exponential moving average (EMA), as generalized here, extends the WMA with a higher-order term, y = ax^2 + mx + b (strictly speaking, a true EMA uses geometrically decaying weights; the quadratic kernel captures the same idea of emphasizing recent bars more steeply). Increasing the value of a makes the average far more sensitive to recent price action than to past bars.

These are just three of the most common kernels, but the choice of kernel can be entirely your own, and this is what I am presenting to the TradingView community. These methods are common in the field of signal processing, and hopefully soon in trading as well.
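As a concrete sketch of the general idea (Python and NumPy; the function name `kernel_ma` and the example prices are mine for illustration, not from any released script), the flat and linear kernels look like this:

```python
import numpy as np

def kernel_ma(prices, kernel):
    """Kernel-weighted moving average: kernel[0] weights the most recent bar."""
    k = np.asarray(kernel, dtype=float)
    k = k / k.sum()                      # normalize weights to sum to 1
    window = prices[-len(k):][::-1]      # slice the last n bars, most recent first
    return float(np.dot(window, k))

prices = np.array([10.0, 11.0, 12.0, 13.0, 14.0])

flat   = np.ones(5)                      # SMA kernel: every bar weighted equally
linear = np.arange(5, 0, -1)             # WMA kernel: 5,4,3,2,1 (recent heaviest)

sma = kernel_ma(prices, flat)            # (10+11+12+13+14)/5 = 12.0
wma = kernel_ma(prices, linear)          # (14*5+13*4+12*3+11*2+10*1)/15
```

Any kernel shape can be dropped into the same weighted sum; only the weight vector changes.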
Here I have built 3 new kernels for everyone in an indicator I will release soon.
1. The generalized polynomial kernel (blue)
Whereas the WMA is defined by y = mx + b and the EMA (as above) by y = ax^2 + mx + b, the generalized polynomial kernel can take an eighth-order polynomial as a kernel function: y = sum(r_i x^i), where i ranges from 0 to 8 and the user supplies the 9 coefficients r_i. To recover a WMA, set the last 7 coefficients to zero; to recover the EMA, set the last 6 to zero. If you are curious what shape your polynomial makes, you can plug it into Wolfram Alpha or Google to plot it. This is the blue line on the chart above, with all coefficients set to 1 by default.
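A minimal sketch of how such a polynomial kernel could be generated, assuming the lag axis is mapped onto x in [0, 1] with x = 1 at the most recent bar (that mapping and the name `poly_kernel` are my assumptions, not taken from the indicator itself):

```python
import numpy as np

def poly_kernel(coeffs, n):
    """Evaluate y = sum_i coeffs[i] * x**i over n points on x in [0, 1].
    x = 0 is the oldest bar in the window, x = 1 the most recent."""
    x = np.linspace(0.0, 1.0, n)
    return sum(c * x**i for i, c in enumerate(coeffs))

n = 20
wma_like = poly_kernel([0.0, 1.0], n)        # y = x: the linear (WMA-style) kernel
quad     = poly_kernel([0.0, 0.0, 1.0], n)   # y = x^2: steeper recent-bar emphasis
full     = poly_kernel([1.0] * 9, n)         # all 9 coefficients set to 1 (default)
```

The resulting weight vector can then be normalized and dotted with the price window exactly like any other kernel.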
2. The gaussian kernel (red)
This option sets the moving average kernel to a gaussian. The important things to consider are where it is centered and how broad it is. If the width of the gaussian is sufficiently larger than the moving average window, you start to approximate a simple moving average; if the width is very narrow, you are essentially sampling the bars from however long ago the gaussian is centered, like creating an offset. If it is centered close to the most recent bars, there is a smooth drop-off in weightings with negative concavity. This is the red line on the chart.
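One possible construction of such a gaussian kernel (the function name and the bars-ago parameterization are my assumptions; the indicator's exact parameterization may differ):

```python
import numpy as np

def gaussian_kernel(n, center, width):
    """Gaussian weights over the last n bars.
    center is in bars-ago (0 = current bar); width is the std dev in bars."""
    lag = np.arange(n)                   # 0 = most recent bar, n-1 = oldest
    w = np.exp(-0.5 * ((lag - center) / width) ** 2)
    return w / w.sum()                   # normalize weights to sum to 1

narrow = gaussian_kernel(50, center=10, width=1.0)   # ~samples the bar 10 back
broad  = gaussian_kernel(50, center=0, width=500.0)  # ~flat, approaches the SMA
```

The `narrow` case behaves like an offset (almost all weight lands on one historic bar), while the `broad` case spreads weight nearly evenly across the window.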
3. The noise kernel (green)
The idea of this one is simple: just make a random kernel. Any value of the kernel can have a vastly different weight than its neighbors. As TradingView has no random number generator, I used a quasi-random one that multiplies the Unix time by the price and takes the sine of the result. For being totally random it also appears to be useful. This is the green line on the chart.
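A sketch of this quasi-random scheme (the per-lag multiplier, the `abs` to keep weights positive, and the normalization are my additions; the original script may use the raw sine values differently):

```python
import numpy as np

def noise_kernel(n, unix_time, price):
    """Quasi-random weights: sin(time * price * lag) per lag, as a stand-in
    for a true RNG, which Pine Script lacks. Seeded by bar time and price."""
    lag = np.arange(1, n + 1)
    w = np.abs(np.sin(unix_time * price * lag))  # abs keeps weights positive
    return w / w.sum()                           # normalize weights to sum to 1

w = noise_kernel(10, unix_time=1_543_000_000, price=4200.0)
```

Because the seed changes every bar, the kernel is re-randomized on each update rather than fixed once.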
The script for this will be coming soon; I just have to clean it up for everyone. Keep in mind that this indicator is not meant to be applied to charts as-is: it is designed for people to customize and experiment with first.
If anyone has ideas to test with this, I am incredibly interested in exploring it deeper. I am using this general idea to move toward some very interesting and potentially powerful applications; if anyone wants to talk about the technicalities, please feel free to message me.
Mcbw
Current Research
I've been getting some messages about possible collaborations and questions about what I'm working on, so here are a few of my current projects that I will start working on in the next few months. If you have a good background in programming or maths and have interest in these projects, feel free to message me!
I- Holy Grail
a) Making the Holy Grail pick peaks and valleys better
b) Define zones of ranging: range trade inside the zone, and enter the trend trade when price leaves it
c) Developing classes of new dynamic modulators, based on price and/or volume
d) Solving for the dominant periods of price, i.e. the largest coefficients of the Fourier series
e) Optimized scaling techniques based on unrealized p/l
II- Horizontal Logic
a) Identify horizontal lines that, when crossed, carry a certain probability of price crossing the next line
i) High probability lines will be targets
ii) Low probability lines are a good way to probabilistically define ranges
b) Find adaptive triggers that can identify good lines that will be crossed
c) Quantify these in time
d) Classify ranges and trends
III- Ichimoku Methods
a) Make extensive modifications to the cloud for very high probability strategies
b) Optimized scaling techniques based on cloud/price information
c) Forward test, then automate
Cashing out on market asymmetries
Skip to the second paragraph to get to the point; the first paragraph discusses observations...
What you see at the bottom of the screen is a script I recently published just for fun; it's a momentum measurement for bitcoin against several fiat pairs and crypto pairs, all summed together. The top of the screen is essentially the BTC price, but somewhat more reactive. What is really interesting here is that on the morning of 21.11 the red line (which represents the sum of bitcoin momentum across multiple pairs) took a steep dive, to an order of magnitude below its usual levels (bottom pink circle): an obvious red flag for long holders. Then, 3 hours later, the XBTUSD price took a quick decline as well! The steep decline in momentum appears largely driven by the DASHBTC pair, but it took a few hours to show up in the BTCUSD pair. The reason so many crypto prices are stable is the work of people doing arbitrage, which normalizes the market. In this case, however, it took longer than expected, possibly for a number of reasons: the time of day, blockchain verification delays, simply too few arbitrage bots on the DASHBTC pair, or something else entirely... The insane divergence of DASHBTC from the overall value of BTC can be read in only two ways: either Dash has sudden market strength, or money is moving from BTC to DASH, possibly foreshadowing a decline in BTC.
What would be interesting is a suite of indicators that measure how volume moves through a specific crypto to find its overall worth (i.e. across multiple exchanges and multiple pairings). If a lot of money is being poured into, or out of, one pair, then the overall value may also be affected with a time delay; enough of a delay to get a great position (like the 3 hours mentioned above). I will need to think about how to appropriately measure this: normalization schemes, volumes, momentum, price, etc... If anyone wishes to discuss ideas related to this I would be more than happy to talk.
Cheers!
Unique arbitrage opportunity
What you are looking at here is an interesting ratio. The first part is the ETHUSD/ETHEUR ratio, as given by Kraken. This simply gives the EURUSD rate in terms of ETH. Theoretically this should be exactly the EURUSD rate given by forex, but sometimes it isn't. So if we take the (ETHUSD/ETHEUR) ratio and compare it to the real EURUSD rate, it looks like this:
ETHUSD/ETHEUR/EURUSD. When it is exactly 1.00 there are no arbitrage opportunities, but once commission and slippage are included it only seems profitable above 1.01 or below 0.99. In practice this would be executed by an API triggered to move EUR to USD through ETH when the ratio is at or above 1.02, and back to EUR through ETH when the ratio is at or below 0.98. This would give you roughly a 3% profit in your EUR holdings each time it triggers. Of course this can be done between any three currency pairs: BTCUSD/BTCEUR/EURUSD, BTCXRP/BTCETH/ETHXRP, or anything of the form ab/ac/cb.
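The triangular ratio and the trigger levels from the text can be sketched as follows (function names are mine and the signals are purely illustrative, not a trading system; commission and slippage are not modeled):

```python
def triangular_ratio(ab, ac, cb):
    """Ratio (ab / ac) / cb; exactly 1.0 means no arbitrage gap."""
    return (ab / ac) / cb

def signal(ratio, upper=1.02, lower=0.98):
    """Trigger levels from the text: act only outside [lower, upper]."""
    if ratio >= upper:
        return "move EUR -> ETH -> USD"   # synthetic EURUSD is above spot
    if ratio <= lower:
        return "move USD -> ETH -> EUR"   # synthetic EURUSD is below spot
    return "hold"

# Hypothetical quotes: ab = ETHUSD, ac = ETHEUR, cb = EURUSD
r = triangular_ratio(ab=120.0, ac=100.0, cb=1.15)
action = signal(r)
```

The same three functions work for any ab/ac/cb triple, crypto or fiat.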
Adaptive Derivative Analysis
The idea of a derivative is powerful and especially useful in trading. We often don't care so much what the price is as how it moves. For that reason the velocity, acceleration, and (rarely) the jerk of price are valuable information. There are a lot of numerical issues with taking time derivatives of price, the biggest of which, in my eyes, is that all transforms and filters must be causal: because we can't use central differences, all current information about the future must come from the past. The obvious workaround is to focus on long-term moving-average derivatives, as this minimizes the noise and provides some certainty about the next few price bars. The obvious issue with that is that sometimes price moves very quickly, and enormous opportunities can be missed by focusing on the bigger picture. Likewise, focusing on the local picture leads to whipsaws, over-trading, and really messy time derivatives. To address this I am working on a way to adaptively move through different averaging windows to extract the most useful derivatives; it is a kind of new momentum indicator. Here it is shown operating on the GBPUSD pair in the bottom window. I will not explain this indicator too much in this post, but I will be releasing it soon with more information on how to use it. Briefly, the black line represents the momentum, and as the lines/clouds move through it, it gives useful information about what happens next. Take a look at how the red, orange, and yellow clouds move above, through, and under the black line before breakouts.
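A minimal illustration of a causal (backward-looking) derivative over several windows; the post does not describe the actual adaptive window-selection rule, so this sketch just computes the candidate velocities that such a rule would choose between (all names are mine):

```python
import numpy as np

def causal_velocity(prices, window):
    """Causal first derivative: average change per bar over the last
    `window` bars. Uses only past data, never a central difference."""
    return (prices[-1] - prices[-1 - window]) / window

def multi_window_momentum(prices, windows=(5, 20, 60)):
    """Velocity at several lookbacks: short windows react fast but whipsaw,
    long windows are smooth but lag. Returns {window: velocity}."""
    return {w: causal_velocity(prices, w) for w in windows if len(prices) > w}

prices = np.cumsum(np.ones(100))   # a steady uptrend: +1 per bar
m = multi_window_momentum(prices)  # every window agrees on velocity 1.0 here
```

In a real trend the short and long velocities diverge, and picking which window to trust at each bar is exactly the adaptive problem described above.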
More to come...
Cheers!