GKD-B Multi-Ticker Stepped Baseline [Loxx]
Giga Kaleidoscope GKD-B Multi-Ticker Stepped Baseline is a Baseline module included in Loxx's "Giga Kaleidoscope Modularized Trading System".
This version of the GKD-B Baseline is designed specifically to support traders who wish to conduct GKD-BT Multi-Ticker Backtests with multiple tickers. This functionality is exclusive to the GKD-BT Multi-Ticker Backtests.
Traders have the capability to apply a filter to the selected moving average, leveraging various volatility metrics to enhance trend identification. This feature is tailored for traders favoring a gradual and consistent approach, enabling them to discern more sustainable trends. The system permits filtering for both the input data and the moving average results, requiring price movements to exceed a specific threshold, defined as multiples of the volatility, before acknowledging a trend change. This mechanism effectively reduces false signals caused by market noise and lateral movements.
A distinctive aspect of this tool is its ability to adjust both price and moving average data based on volatility indicators like VIX, EUVIX, BVIV, and EVIV, among others.
Understanding the time frame over which a volatility index is measured is crucial; for instance, VIX is measured on an annual basis, whereas BVIV and EVIV are based on a 30-day period. To accurately convert these measurements to a daily scale, users must input the correct "days per year" value: 252 for VIX and 30 for BVIV and EVIV. Future updates will introduce additional functionality to extend analysis across various time frames, but currently, this feature is solely available for daily time frame analysis.
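As a rough illustration of the two ideas above (rescaling a volatility index to a daily value using the "days per year" input, and stepping the baseline so it only moves once price has travelled more than a multiple of that volatility), here is a minimal Python sketch. The function names, the multiplier, and the percentage-based threshold are illustrative assumptions, not the script's exact logic.

```python
import math

def index_vol_to_daily(index_level_pct, days_per_year):
    # A volatility index is quoted as an annualized (VIX, 252) or 30-day
    # (BVIV/EVIV, 30) percentage; divide by sqrt(days) to get a daily figure.
    return (index_level_pct / 100.0) / math.sqrt(days_per_year)

def stepped_series(values, daily_vol, mult=1.0):
    # Hold the previous accepted value until the new value has moved more
    # than mult * daily_vol * previous value; this is the "stepping" idea.
    stepped = [values[0]]
    for v in values[1:]:
        threshold = mult * daily_vol * stepped[-1]
        stepped.append(v if abs(v - stepped[-1]) > threshold else stepped[-1])
    return stepped

# Example: a VIX reading of 20 implies roughly a 1.26% daily move.
print(round(index_vol_to_daily(20, 252) * 100, 2))
```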
█ GKD-B Multi-Ticker Stepped Baseline includes 65+ different moving averages:
Adaptive Moving Average - AMA
ADXvma - Average Directional Volatility Moving Average
Ahrens Moving Average
Alexander Moving Average - ALXMA
Deviation Scaled Moving Average - DSMA
Donchian
Double Exponential Moving Average - DEMA
Double Smoothed Exponential Moving Average - DSEMA
Double Smoothed FEMA - DSFEMA
Double Smoothed Range Weighted EMA - DSRWEMA
Double Smoothed Wilders EMA - DSWEMA
Double Weighted Moving Average - DWMA
Ehlers Optimal Tracking Filter - EOTF
Exponential Moving Average - EMA
Fast Exponential Moving Average - FEMA
Fractal Adaptive Moving Average - FRAMA
Generalized DEMA - GDEMA
Generalized Double DEMA - GDDEMA
Hull Moving Average (Type 1) - HMA1
Hull Moving Average (Type 2) - HMA2
Hull Moving Average (Type 3) - HMA3
Hull Moving Average (Type 4) - HMA4
IE/2 - Early T3 by Tim Tillson
Integral of Linear Regression Slope - ILRS
Kaufman Adaptive Moving Average - KAMA
Laguerre Filter
Leader Exponential Moving Average
Linear Regression Value - LSMA ( Least Squares Moving Average )
Linear Weighted Moving Average - LWMA
McGinley Dynamic
McNicholl EMA
Non-Lag Moving Average
Ocean NMA Moving Average - ONMAMA
One More Moving Average - OMA
Parabolic Weighted Moving Average
Probability Density Function Moving Average - PDFMA
Quadratic Regression Moving Average - QRMA
Regularized EMA - REMA
Range Weighted EMA - RWEMA
Recursive Moving Trendline
Simple Decycler - SDEC
Simple Jurik Moving Average - SJMA
Simple Moving Average - SMA
Sine Weighted Moving Average
Smoothed LWMA - SLWMA
Smoothed Moving Average - SMMA
Smoother
Super Smoother
T3
Three-pole Ehlers Butterworth
Three-pole Ehlers Smoother
Triangular Moving Average - TMA
Triple Exponential Moving Average - TEMA
Two-pole Ehlers Butterworth
Two-pole Ehlers smoother
Variable Index Dynamic Average - VIDYA
Variable Moving Average - VMA
Volume Weighted EMA - VEMA
Volume Weighted Moving Average - VWMA
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
Zero-Lag Moving Average
Zero Lag TEMA - Zero Lag Triple Exponential Moving Average
Geometric Mean Moving Average
Coral
Tether Lines
Range Filter
Triangle Moving Average Generalized
Ultimate Smoother
Adaptive Moving Average - AMA
The Adaptive Moving Average (AMA) is a moving average that changes its sensitivity to price moves depending on the calculated volatility. It becomes more sensitive during periods when the price is moving smoothly in a certain direction and becomes less sensitive when the price is volatile.
ADXvma - Average Directional Volatility Moving Average
Linnsoft's ADXvma formula is a volatility-based moving average, with the volatility being determined by the value of the ADX indicator.
The ADXvma replaces the SMA in Chande's CMO with an EMA; it then uses a few more layers of EMA smoothing before the "Volatility Index" is calculated.
A side effect is that those additional layers slow down the ADXvma compared to Chande's Variable Index Dynamic Average (VIDYA).
The ADXVMA provides support during uptrends and resistance during downtrends and will stay flat for longer, but will create some of the most accurate market signals when it decides to move.
Ahrens Moving Average
Richard D. Ahrens's Moving Average promises "Smoother Data" that isn't influenced by the occasional price spike. It works by using the Open and the Close in his formula so that the only time the Ahrens Moving Average will change is when the candlestick is either making new highs or new lows.
Alexander Moving Average - ALXMA
This Moving Average uses an elaborate smoothing formula and utilizes a 7 period Moving Average. It corresponds to fitting a second-order polynomial to seven consecutive observations. This moving average is rarely used in trading but is interesting as this Moving Average has been applied to diffusion indexes that tend to be very volatile.
Deviation Scaled Moving Average - DSMA
The Deviation-Scaled Moving Average is a data smoothing technique that acts like an exponential moving average with a dynamic smoothing coefficient. The smoothing coefficient is automatically updated based on the magnitude of price changes. In the Deviation-Scaled Moving Average, the standard deviation from the mean is chosen to be the measure of this magnitude. The resulting indicator provides substantial smoothing of the data even when price changes are small while quickly adapting to these changes.
Donchian
Donchian Channels are three lines generated by moving average calculations that comprise an indicator formed by upper and lower bands around a midrange or median band. The upper band marks the highest price of a security over N periods while the lower band marks the lowest price of a security over N periods.
Double Exponential Moving Average - DEMA
The Double Exponential Moving Average (DEMA) combines a smoothed EMA and a single EMA to provide a low-lag indicator. Its primary purpose is to reduce the amount of "lagging entry" opportunities, and like all Moving Averages, the DEMA confirms uptrends whenever price crosses on top of it and closes above it, and confirms downtrends when the price crosses under it and closes below it - but with significantly less lag.
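A minimal Python sketch of this construction; the `ema` helper is written out here for illustration.

```python
def ema(series, period):
    # Standard exponential moving average with alpha = 2 / (period + 1).
    alpha = 2.0 / (period + 1)
    out = [series[0]]
    for price in series[1:]:
        out.append(alpha * price + (1 - alpha) * out[-1])
    return out

def dema(series, period):
    # DEMA = 2 * EMA(price) - EMA(EMA(price)); the second term removes
    # much of the lag introduced by the first smoothing pass.
    e1 = ema(series, period)
    e2 = ema(e1, period)
    return [2 * a - b for a, b in zip(e1, e2)]
```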
Double Smoothed Exponential Moving Average - DSEMA
The Double Smoothed Exponential Moving Average is a lot less laggy compared to a traditional EMA . It's also considered a leading indicator compared to the EMA , and is best utilized whenever smoothness and speed of reaction to market changes are required.
Double Smoothed FEMA - DSFEMA
Same as the Double Exponential Moving Average (DEMA), but uses a faster version of EMA for its calculation.
Double Smoothed Range Weighted EMA - DSRWEMA
The range-weighted EMA is, unlike the "regular" range-weighted average, calculated in a different way. Even though the basis - the range weighting - is the same, the calculation itself is completely different: by definition, this type of EMA is calculated as the ratio EMA(price × weight) / EMA(weight). The results are very different, and the two should be considered completely different types of averages. The stronger-than-EMA responsiveness to price changes when ranges increase remains in this EMA too, and in those cases this EMA clearly leads the "regular" range-weighted EMA. This version includes double smoothing.
Double Smoothed Wilders EMA - DSWEMA
Welles Wilder frequently used one "special" case of the EMA (Exponential Moving Average) that, because he used it, is sometimes called Wilder's EMA. This version adds double smoothing to Wilder's EMA in order to make it "faster" (more responsive to market prices than the original) while still keeping very smooth values.
Double Weighted Moving Average - DWMA
The Double Weighted Moving Average is based on the LWMA (Linear Weighted Moving Average). Instead of running the weighting cycle once, the indicator cycles the loop twice, which produces smoother values than the original LWMA.
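A minimal sketch of the idea, assuming a plain trailing-window LWMA; the warm-up handling is an illustrative choice.

```python
def lwma(series, period):
    # Linear weighted MA: the newest price in the window gets the largest
    # weight, the oldest gets a weight of 1.
    out = []
    for i in range(len(series)):
        window = series[max(0, i - period + 1): i + 1]
        w = list(range(1, len(window) + 1))
        out.append(sum(p * x for p, x in zip(window, w)) / sum(w))
    return out

def dwma(series, period):
    # Double weighted MA: simply an LWMA applied to an LWMA.
    return lwma(lwma(series, period), period)
```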
Ehlers Optimal Tracking Filter - EOTF
Ehlers's Optimum Tracking Filter quickly adjusts to rapid shifts in price yet remains relatively smooth when price action is sideways. The operation of this filter is similar to Kaufman's Adaptive Moving Average.
Exponential Moving Average - EMA
The EMA places more significance on recent data points and moves closer to price than the SMA ( Simple Moving Average ). It reacts faster to volatility due to its emphasis on recent data and is known for its ability to give greater weight to recent and more relevant data. The EMA is therefore seen as an enhancement over the SMA .
Fast Exponential Moving Average - FEMA
An Exponential Moving Average with a short look-back period.
Fractal Adaptive Moving Average - FRAMA
The Fractal Adaptive Moving Average by John Ehlers is an intelligent adaptive Moving Average which takes the importance of price changes into account and follows price closely enough to display significant moves whilst remaining flat if price ranges. The FRAMA does this by dynamically adjusting the look-back period based on the market's fractal geometry.
Generalized DEMA - GDEMA
The double exponential moving average (DEMA) was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages". Instead of using a fixed multiplication factor in the final DEMA formula, the generalized version allows you to change it. By varying the "volume factor" from 0 to 1 you apply different multiplications and thus produce a DEMA with a different "speed" - the higher the volume factor, the "faster" the DEMA (but also the less smooth its slope). The volume factor is limited to 1 in the calculation, since any volume factor larger than 1 increases overshooting to the extent that some volume factor settings make the indicator unusable.
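A minimal sketch of the generalized form described above, assuming the common Tillson "GD" formulation in which the volume factor scales the lag-correction term; the default values are illustrative.

```python
def gdema(series, period, volume_factor=0.7):
    # Generalized DEMA: EMA * (1 + v) - EMA(EMA) * v.
    # volume_factor v = 0 gives a plain EMA, v = 1 gives the classic DEMA.
    def ema(s, n):
        alpha, out = 2.0 / (n + 1), [s[0]]
        for x in s[1:]:
            out.append(alpha * x + (1 - alpha) * out[-1])
        return out
    e1 = ema(series, period)
    e2 = ema(e1, period)
    v = volume_factor
    return [(1 + v) * a - v * b for a, b in zip(e1, e2)]
```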
Generalized Double DEMA - GDDEMA
The double exponential moving average (DEMA) was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages". This is an extension of the Generalized DEMA using Tim Tillson's (the inventor of T3) idea, and it uses a GDEMA of a GDEMA for its calculation (the "middle step" of the T3 calculation). Since there are no versions showing that middle step, this version covers that too. The result is smoother than the Generalized DEMA but less smooth than T3 - one has to do some experimenting in order to find the optimal way to use it, but in any case, since it is "faster" than T3 and still smooth, it looks like a good compromise between speed and smoothness.
Hull Moving Average (Type 1) - HMA1
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMA for smoothing.
Hull Moving Average (Type 2) - HMA2
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses EMA for smoothing.
Hull Moving Average (Type 3) - HMA3
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses LWMA for smoothing.
Hull Moving Average (Type 4) - HMA4
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMMA for smoothing.
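The four HMA types above share the same construction and differ only in which moving average is used for the smoothing passes. A minimal Python sketch of the classic weighted-MA form (the LWMA-smoothed Type 3 case) follows; the helper names and warm-up handling are illustrative assumptions.

```python
import math

def wma(series, period):
    # Linearly weighted MA over a trailing window (newest bar weighted most).
    out = []
    for i in range(len(series)):
        window = series[max(0, i - period + 1): i + 1]
        w = list(range(1, len(window) + 1))
        out.append(sum(p * x for p, x in zip(window, w)) / sum(w))
    return out

def hma(series, period):
    # HMA = WMA(2 * WMA(n/2) - WMA(n), sqrt(n)); the other HMA types swap
    # the smoothing WMA for an SMA, EMA, or SMMA.
    half = wma(series, max(1, period // 2))
    full = wma(series, period)
    raw = [2 * a - b for a, b in zip(half, full)]
    return wma(raw, max(1, int(math.sqrt(period))))
```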
IE/2 - Early T3 by Tim Tillson and T3
The T3 moving average is a type of technical indicator used in financial analysis to identify trends in price movements. It is similar to the Exponential Moving Average (EMA) and the Double Exponential Moving Average (DEMA), but uses a different smoothing algorithm.
The T3 moving average is calculated using a series of exponential moving averages that are designed to filter out noise and smooth the data. The resulting smoothed data is then weighted with a non-linear function to produce a final output that is more responsive to changes in trend direction.
The T3 moving average can be customized by adjusting the length of the moving average, as well as the weighting function used to smooth the data. It is commonly used in conjunction with other technical indicators as part of a larger trading strategy.
Integral of Linear Regression Slope - ILRS
A Moving Average where the slope of a linear regression line is simply integrated as it is fitted in a moving window of length N (natural numbers in maths) across the data. The derivative of ILRS is the linear regression slope. ILRS is not the same as a SMA ( Simple Moving Average ) of length N, which is actually the midpoint of the linear regression line as it moves across the data.
Kaufman Adaptive Moving Average - KAMA
Developed by Perry Kaufman, Kaufman's Adaptive Moving Average (KAMA) is a moving average designed to account for market noise or volatility. KAMA will closely follow prices when the price swings are relatively small and the noise is low.
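A minimal sketch of Kaufman's published formulation; the default fast/slow periods are the commonly cited ones, not necessarily this module's defaults.

```python
def kama(series, er_period=10, fast=2, slow=30):
    # Kaufman AMA: the efficiency ratio (net move / sum of absolute moves)
    # scales the smoothing constant between a fast and a slow EMA alpha.
    fast_sc, slow_sc = 2.0 / (fast + 1), 2.0 / (slow + 1)
    out = [series[0]]
    for i in range(1, len(series)):
        lo = max(0, i - er_period)
        change = abs(series[i] - series[lo])
        volatility = sum(abs(series[j] - series[j - 1]) for j in range(lo + 1, i + 1))
        er = change / volatility if volatility else 0.0
        sc = (er * (fast_sc - slow_sc) + slow_sc) ** 2
        out.append(out[-1] + sc * (series[i] - out[-1]))
    return out
```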
Laguerre Filter
The Laguerre Filter is a smoothing filter which is based on Laguerre polynomials. The filter requires the current price, three prior prices, and a user-defined factor called Alpha to complete its calculation.
Adjusting the Alpha coefficient increases or decreases its lag and its smoothness.
Leader Exponential Moving Average
The Leader EMA was created by Giorgos E. Siligardos, who designed a Moving Average able to eliminate lag altogether whilst maintaining some smoothness. It was first described in his research paper "MACD Leader", where he applied it to the MACD to improve its signals and remove its lagging issue. This filter uses his leading MACD's "modified EMA" and can be used as a zero-lag filter.
Linear Regression Value - LSMA ( Least Squares Moving Average )
LSMA as a Moving Average is based on plotting the end point of the linear regression line. It compares the current value to the prior value, and a determination is made of a possible trend, e.g., whether the linear regression line is pointing up or down.
Linear Weighted Moving Average - LWMA
LWMA reacts to price quicker than the SMA and EMA . Although it's similar to the Simple Moving Average , the difference is that a weight coefficient is multiplied to the price which means the most recent price has the highest weighting, and each prior price has progressively less weight. The weights drop in a linear fashion.
McGinley Dynamic
John McGinley created this Moving Average to track prices better than traditional Moving Averages. It does this by incorporating an automatic adjustment factor into its formula, which speeds (or slows) the indicator in trending, or ranging, markets.
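A minimal sketch of the commonly published McGinley Dynamic recursion; the constant k = 0.6 is a frequently cited choice and an assumption here, not necessarily what this module uses.

```python
def mcginley_dynamic(series, period=14, k=0.6):
    # MD[i] = MD[i-1] + (price - MD[i-1]) / (k * period * (price / MD[i-1])^4).
    # The (price/MD)^4 term speeds the average up when price pulls away from it.
    out = [series[0]]
    for price in series[1:]:
        prev = out[-1]
        out.append(prev + (price - prev) / (k * period * (price / prev) ** 4))
    return out
```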
McNicholl EMA
Dennis McNicholl developed this Moving Average to use as his center line for his "Better Bollinger Bands" indicator and was successful because it responded better to volatility changes over the standard SMA and managed to avoid common whipsaws.
Non-Lag Moving Average
The Non-Lag Moving Average follows price closely and gives very quick signals as well as early signals of price change. It should not be used on its own as a standalone Moving Average, but rather as an additional confluence tool for early signals.
Ocean NMA Moving Average - ONMAMA
Created by Jim Sloman, the NMA is a moving average that automatically adjusts to volatility without being programmed to do so. For more info, read his guide "Ocean Theory, an Introduction"
One More Moving Average (OMA)
The One More Moving Average (OMA) is a technical indicator that calculates a series of Jurik-style moving averages in order to reduce noise and provide smoother price data. It uses six exponential moving averages to generate the final value, with the length of the moving averages determined by an adaptive algorithm that adjusts to the current market conditions. The algorithm calculates the average period by comparing the signal to noise ratio and using this value to determine the length of the moving averages. The resulting values are used to generate the final value of the OMA, which can be used to identify trends and potential changes in trend direction.
Parabolic Weighted Moving Average
The Parabolic Weighted Moving Average is a variation of the Linear Weighted Moving Average . The Linear Weighted Moving Average calculates the average by assigning different weights to each element in its calculation. The Parabolic Weighted Moving Average is a variation that allows weights to be changed to form a parabolic curve. It is done simply by using the Power parameter of this indicator.
Probability Density Function Moving Average - PDFMA
The probability density function based MA is a sort of weighted moving average that uses a probability density function to calculate the weights. By its nature it is similar to a lot of digital filters.
Quadratic Regression Moving Average - QRMA
A quadratic regression is the process of finding the equation of the parabola that best fits a set of data. This moving average is an obscure concept that was posted to Forex forums in around 2008.
Regularized EMA - REMA
The regularized exponential moving average (REMA) by Chris Satchwell is a variation on the EMA (see Exponential Moving Average) designed to be smoother but not introduce too much extra lag.
Range Weighted EMA - RWEMA
This indicator is a variation of the range weighted EMA. The variation comes from a possible need to make that indicator a bit less "noisy" when it comes to slope changes. The method used for calculating this variation is the method described by Lee Leibfarth in his article "Trading With An Adaptive Price Zone".
Recursive Moving Trendline
Dennis Meyers's Recursive Moving Trendline uses a recursive (repeated application of a rule) polynomial fit, a technique that uses a small number of past value estimates of price together with today's price to predict tomorrow's price.
Simple Decycler - SDEC
The Ehlers Simple Decycler study is a virtually zero-lag technical indicator proposed by John F. Ehlers. The original idea behind this study (and several others created by John F. Ehlers) is that market data can be considered a continuum of cycle periods with different cycle amplitudes. Thus, trending periods can be considered segments of longer cycles, or, in other words, low-frequency segments. Applying the right filter might help identify these segments.
Simple Loxx Moving Average - SLMA
A three-stage moving average combining an adaptive EMA, a Kalman Filter, and a Kaufman adaptive filter.
Simple Moving Average - SMA
The SMA calculates the average of a range of prices by adding recent prices and then dividing that figure by the number of time periods in the calculation average. It is the most basic Moving Average which is seen as a reliable tool for starting off with Moving Average studies. As reliable as it may be, the basic moving average will work better when it's enhanced into an EMA .
Sine Weighted Moving Average
The Sine Weighted Moving Average assigns the most weight to the middle of the data set. It does this by taking its weights from the first half of a sine wave cycle, so the most weight is given to the data in the middle of the window. The Sine WMA closely resembles the TMA (Triangular Moving Average).
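A minimal sketch of sine weighting over a trailing window; the warm-up handling is an illustrative choice.

```python
import math

def sine_wma(series, period):
    # Weights follow the first half of a sine cycle, so the centre of the
    # window gets the most weight and both ends get the least.
    out = []
    for i in range(len(series)):
        window = series[max(0, i - period + 1): i + 1]
        n = len(window)
        w = [math.sin(math.pi * (k + 1) / (n + 1)) for k in range(n)]
        out.append(sum(p * x for p, x in zip(window, w)) / sum(w))
    return out
```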
Smoothed LWMA - SLWMA
A smoothed version of the LWMA
Smoothed Moving Average - SMMA
The Smoothed Moving Average is similar to the Simple Moving Average ( SMA ), but aims to reduce noise rather than reduce lag. SMMA takes all prices into account and uses a long lookback period. Due to this, it's seen as an accurate yet laggy Moving Average.
Smoother
The Smoother filter is a faster-reacting smoothing technique which generates considerably less lag than the SMMA (Smoothed Moving Average). It gives earlier signals but can also create false signals due to its earlier reactions. This filter is sometimes mistaken for the superior Jurik Smoothing algorithm.
Super Smoother
The Super Smoother filter uses John Ehlers's "Super Smoother", which consists of a two-pole Butterworth filter combined with a 2-bar SMA (Simple Moving Average) that suppresses the Nyquist frequency - the highest frequency representable by a sampler, which converts a continuous function or signal into a discrete sequence.
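A minimal sketch of Ehlers's two-pole Super Smoother recursion as it is usually published; the seeding of the first two values is an illustrative choice.

```python
import math

def super_smoother(series, period=10):
    # Two-pole Butterworth coefficients applied to a 2-bar average of price.
    a1 = math.exp(-math.sqrt(2) * math.pi / period)
    b1 = 2 * a1 * math.cos(math.sqrt(2) * math.pi / period)
    c2, c3 = b1, -a1 * a1
    c1 = 1 - c2 - c3
    out = [series[0], series[1] if len(series) > 1 else series[0]]
    for i in range(2, len(series)):
        out.append(c1 * (series[i] + series[i - 1]) / 2 + c2 * out[-1] + c3 * out[-2])
    return out[:len(series)]
```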
Three-pole Ehlers Butterworth
The 3 pole Ehlers Butterworth (as well as the Two pole Butterworth) are both superior alternatives to the EMA and SMA . They aim at producing less lag whilst maintaining accuracy. The 2 pole filter will give you a better approximation for price, whereas the 3 pole filter has superior smoothing.
Three-pole Ehlers smoother
The 3 pole Ehlers smoother works almost as close to price as the above mentioned 3 Pole Ehlers Butterworth. It acts as a strong baseline for signals but removes some noise. Side by side, it hardly differs from the Three Pole Ehlers Butterworth but when examined closely, it has better overshoot reduction compared to the 3 pole Ehlers Butterworth.
Triangular Moving Average - TMA
The TMA is similar to the EMA but uses a different weighting scheme. Exponential and weighted Moving Averages assign more weight to the most recent price data, while simple moving averages assign the weight equally across all the price data. With a TMA (Triangular Moving Average), the data is double smoothed (averaged twice), so the majority of the weight is assigned to the middle portion of the data.
Triple Exponential Moving Average - TEMA
The TEMA uses multiple EMA calculations as well as subtracting lag to create a tool which can be used for scalping pullbacks. As it follows price closely, its signals are considered very noisy and should only be used in extremely fast-paced trading conditions.
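A minimal sketch of the standard TEMA combination.

```python
def tema(series, period):
    # TEMA = 3*EMA1 - 3*EMA2 + EMA3, where each EMA is applied to the
    # previous one; the combination cancels most of the compounded lag.
    def ema(s, n):
        alpha, out = 2.0 / (n + 1), [s[0]]
        for x in s[1:]:
            out.append(alpha * x + (1 - alpha) * out[-1])
        return out
    e1 = ema(series, period)
    e2 = ema(e1, period)
    e3 = ema(e2, period)
    return [3 * a - 3 * b + c for a, b, c in zip(e1, e2, e3)]
```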
Two-pole Ehlers Butterworth
The 2 pole Ehlers Butterworth (as well as the three pole Butterworth mentioned above) is another filter that cuts out the noise and follows the price closely. The 2 pole is seen as a faster, leading filter over the 3 pole and follows price a bit more closely. Analysts will utilize both a 2 pole and a 3 pole Butterworth on the same chart using the same period, but having both on chart allows its crosses to be traded.
Two-pole Ehlers smoother
A smoother version of the Two-pole Ehlers Butterworth. This filter is faster than the three-pole version and does a decent job of cutting out market noise whilst following price more closely than the three-pole Ehlers smoother.
Variable Index Dynamic Average - VIDYA
Variable Index Dynamic Average Technical Indicator ( VIDYA ) was developed by Tushar Chande. It is an original method of calculating the Exponential Moving Average ( EMA ) with the dynamically changing period of averaging.
Variable Moving Average - VMA
The Variable Moving Average (VMA) is a study based on an Exponential Moving Average that automatically adjusts its smoothing factor according to market volatility.
Volume Weighted EMA - VEMA
An EMA that uses a volume and price weighted calculation instead of the standard price input.
Volume Weighted Moving Average - VWMA
A Volume Weighted Moving Average is a moving average where more weight is given to bars with heavy volume than with light volume. Thus the value of the moving average will be closer to where most trading actually happened than it otherwise would be without being volume weighted.
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
John Ehlers's Zero Lag DEMA's aim is to eliminate the inherent lag associated with all trend following indicators which average a price over time. Because this is a Double Exponential Moving Average with Zero Lag, it has a tendency to overshoot and create a lot of false signals for swing trading. It can however be used for quick scalping or as a secondary indicator for confluence.
Zero-Lag Moving Average
The Zero Lag Moving Average is described by its creator, John Ehlers , as a Moving Average with absolutely no delay. And it's for this reason that this filter will cause a lot of abrupt signals which will not be ideal for medium to long-term traders. This filter is designed to follow price as close as possible whilst de-lagging data instead of basing it on regular data. The way this is done is by attempting to remove the cumulative effect of the Moving Average.
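A minimal sketch of the usual zero-lag EMA construction; the half-period lag adjustment is the commonly published form and an assumption about this module's exact method.

```python
def zero_lag_ema(series, period):
    # De-lag the input by adding back the distance price has moved over
    # half the period, then apply a plain EMA to the adjusted series.
    lag = (period - 1) // 2
    adjusted = [2 * series[i] - series[max(0, i - lag)] for i in range(len(series))]
    alpha, out = 2.0 / (period + 1), [adjusted[0]]
    for x in adjusted[1:]:
        out.append(alpha * x + (1 - alpha) * out[-1])
    return out
```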
Zero-Lag TEMA - Zero Lag Triple Exponential Moving Average
Just like the Zero Lag DEMA , this filter will give you the fastest signals out of all the Zero Lag Moving Averages. This is useful for scalping but dangerous for medium to long-term traders, especially during market Volatility and news events. Having no lag, this filter also has no smoothing in its signals and can cause some very bizarre behavior when applied to certain indicators.
█ Volatility Goldie Locks Zone
This volatility filter is the standard first-pass filter that is used for all NNFX systems despite the additional volatility/volume filter used in step 5. For this filter, price must fall into a range of maximum and minimum values calculated using multiples of volatility. Unlike the standard NNFX systems, this version of volatility filtering is separated from the core Baseline and uses its own moving average with Loxx's Exotic Source Types.
█ Volatility Types included
The GKD system utilizes volatility-based take profits and stop losses. Each take profit and stop loss is calculated as a multiple of volatility. You can change the values of the multipliers in the settings as well.
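As a rough illustration of how volatility-based exits work, the sketch below expresses the take profit and stop loss as multiples of a volatility value; the multiplier defaults are placeholders, not the module's settings.

```python
def volatility_exits(entry_price, volatility, tp_mult=1.0, sl_mult=1.5, long=True):
    # Take profit and stop loss expressed as multiples of the chosen
    # volatility measure (ATR, standard deviation, etc.).
    direction = 1 if long else -1
    take_profit = entry_price + direction * tp_mult * volatility
    stop_loss = entry_price - direction * sl_mult * volatility
    return take_profit, stop_loss

# Example: long at 100 with a volatility of 2 -> TP 102, SL 97.
print(volatility_exits(100.0, 2.0))
```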
This module includes 17 types of volatility:
Close-to-Close
Parkinson
Garman-Klass
Rogers-Satchell
Yang-Zhang
Garman-Klass-Yang-Zhang
Exponential Weighted Moving Average
Standard Deviation of Log Returns
Pseudo GARCH(2,2)
Average True Range
True Range Double
Standard Deviation
Adaptive Deviation
Median Absolute Deviation
Efficiency-Ratio Adaptive ATR
Mean Absolute Deviation
Static Percent
These are various volatility estimators and indicators that investors and traders can use to measure the dispersion or volatility of a financial instrument's price. Each estimator has its strengths and weaknesses, and the choice of estimator should depend on the specific needs and circumstances of the user.
Volatility Ticker Selection
Import volatility tickers like VIX, EUVIX, BVIV, and EVIV.
Close-to-Close
Close-to-Close volatility is a classic and widely used volatility measure, sometimes referred to as historical volatility.
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a larger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility is calculated using only a stock's closing prices. It is the simplest volatility estimator. However, in many cases, it is not precise enough. Stock prices could jump significantly during a trading session and return to the opening value at the end. That means that a considerable amount of price information is not taken into account by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
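A minimal sketch of the close-to-close estimator: the sample standard deviation of log returns, scaled by the "days per year" input.

```python
import math

def close_to_close_vol(closes, days_per_year=252):
    # Sample standard deviation of log returns, scaled to an annual figure.
    rets = [math.log(closes[i] / closes[i - 1]) for i in range(1, len(closes))]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / (len(rets) - 1)
    return math.sqrt(var) * math.sqrt(days_per_year)
```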
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. This is useful as close-to-close prices could show little difference while large price movements could have occurred during the day. Thus, Parkinson's volatility is considered more precise and requires less data for calculation than close-to-close volatility.
One drawback of this estimator is that it doesn't take into account price movements after the market closes. Hence, it systematically undervalues volatility. This drawback is addressed in the Garman-Klass volatility estimator.
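A minimal sketch of the Parkinson estimator.

```python
import math

def parkinson_vol(highs, lows, days_per_year=252):
    # Parkinson: mean of squared log high/low ranges, scaled by 1 / (4 * ln 2).
    n = len(highs)
    total = sum(math.log(h / l) ** 2 for h, l in zip(highs, lows))
    return math.sqrt(total / (4 * math.log(2) * n)) * math.sqrt(days_per_year)
```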
Garman-Klass
Garman-Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing prices. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change follows a continuous diffusion process (Geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremes.
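A minimal sketch of the Garman-Klass estimator.

```python
import math

def garman_klass_vol(opens, highs, lows, closes, days_per_year=252):
    # Per bar: 0.5 * ln(H/L)^2 - (2*ln(2) - 1) * ln(C/O)^2.
    n = len(opens)
    total = 0.0
    for o, h, l, c in zip(opens, highs, lows, closes):
        total += 0.5 * math.log(h / l) ** 2 - (2 * math.log(2) - 1) * math.log(c / o) ** 2
    return math.sqrt(total / n) * math.sqrt(days_per_year)
```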
Researchers Rogers and Satchell have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates a drift term (mean return not equal to zero). As a result, it provides better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. This leads to an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
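A minimal sketch of the Rogers-Satchell estimator.

```python
import math

def rogers_satchell_vol(opens, highs, lows, closes, days_per_year=252):
    # Per bar: ln(H/C)*ln(H/O) + ln(L/C)*ln(L/O); handles a non-zero drift.
    n = len(opens)
    total = 0.0
    for o, h, l, c in zip(opens, highs, lows, closes):
        total += math.log(h / c) * math.log(h / o) + math.log(l / c) * math.log(l / o)
    return math.sqrt(total / n) * math.sqrt(days_per_year)
```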
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchell formula in the 2000s by Yang-Zhang. See Yang Zhang Volatility for more detail.
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
Yang-Zhang volatility can be thought of as a combination of the overnight (close-to-open volatility) and a weighted average of the Rogers-Satchell volatility and the day’s open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
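A minimal sketch of the Yang-Zhang combination as it is usually published; the weighting constant k and the indexing of the overnight returns follow the common formulation and are assumptions here.

```python
import math

def yang_zhang_vol(opens, highs, lows, closes, days_per_year=252):
    # Yang-Zhang variance = overnight variance + k * open-to-close variance
    #                     + (1 - k) * Rogers-Satchell variance.
    n = len(opens)
    k = 0.34 / (1.34 + (n + 1) / (n - 1))
    log_oc = [math.log(opens[i] / closes[i - 1]) for i in range(1, n)]   # overnight
    log_co = [math.log(closes[i] / opens[i]) for i in range(1, n)]       # open-to-close
    mean_oc, mean_co = sum(log_oc) / len(log_oc), sum(log_co) / len(log_co)
    var_o = sum((x - mean_oc) ** 2 for x in log_oc) / (len(log_oc) - 1)
    var_c = sum((x - mean_co) ** 2 for x in log_co) / (len(log_co) - 1)
    var_rs = sum(math.log(highs[i] / closes[i]) * math.log(highs[i] / opens[i])
                 + math.log(lows[i] / closes[i]) * math.log(lows[i] / opens[i])
                 for i in range(1, n)) / (n - 1)
    return math.sqrt(var_o + k * var_c + (1 - k) * var_rs) * math.sqrt(days_per_year)
```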
Garman-Klass-Yang-Zhang
Garman-Klass-Yang-Zhang (GKYZ) volatility estimator incorporates the returns of open, high, low, and closing prices in its calculation.
GKYZ volatility estimator takes into account overnight jumps but not the trend, i.e., it assumes that the underlying asset follows a Geometric Brownian Motion (GBM) process with zero drift. Therefore, the GKYZ volatility estimator tends to overestimate the volatility when the drift is different from zero. However, for a GBM process, this estimator is eight times more efficient than the close-to-close volatility estimator.
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, with the main applications being technical analysis and volatility modeling.
The moving average is designed such that older observations are given lower weights. The weights decrease exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
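A minimal sketch of the EWMA variance recursion, with lambda used here as the decay factor on the previous value (the RiskMetrics convention).

```python
import math

def ewma_volatility(returns, lam=0.94):
    # var_t = lam * var_{t-1} + (1 - lam) * r_t^2.
    # With this convention, a higher lambda keeps more weight on older data.
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var)
```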
Standard Deviation of Log Returns
This is the simplest calculation of volatility. It's the standard deviation of ln(close/close(1)).
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θ·avg(var; M) + (1 − θ)·avg(var; N) = 2θ·var / (M + 1 − (M − 1)L) + 2(1 − θ)·var / (N + 1 − (N − 1)L)
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L^-1·var − avg(var; N) against avg(var; M) − avg(var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
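A minimal sketch of the true range and its Wilder-style smoothing.

```python
def atr(highs, lows, closes, period=14):
    # True range per bar, then a Wilder-style smoothed average of it.
    trs = [highs[0] - lows[0]]
    for i in range(1, len(closes)):
        trs.append(max(highs[i] - lows[i],
                       abs(highs[i] - closes[i - 1]),
                       abs(lows[i] - closes[i - 1])))
    out = [trs[0]]
    for tr in trs[1:]:
        out.append((out[-1] * (period - 1) + tr) / period)
    return out
```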
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Standard Deviation
Standard deviation is a statistic that measures the dispersion of a dataset relative to its mean and is calculated as the square root of the variance, determined from each data point's deviation relative to the mean. If the data points are further from the mean, there is a higher deviation within the data set; thus, the more spread out the data, the higher the standard deviation.
Adaptive Deviation
By definition, the Standard Deviation (STD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values. In technical analysis, we usually use it to measure the level of current volatility.
Standard Deviation is based on Simple Moving Average calculation for mean value. This version of standard deviation uses the properties of EMA to calculate what can be called a new type of deviation, and since it is based on EMA, we can call it EMA deviation. Additionally, Perry Kaufman's efficiency ratio is used to make it adaptive (since all EMA type calculations are nearly perfect for adapting).
The difference compared to the standard deviation is significant - not just because of the EMA usage, but because the efficiency ratio makes it a "bit more logical" in very volatile market conditions.
Median Absolute Deviation
The median absolute deviation is a measure of statistical dispersion. Moreover, the MAD is a robust statistic, being more resilient to outliers in a data set than the standard deviation. In the standard deviation, the distances from the mean are squared, so large deviations are weighted more heavily, and thus outliers can heavily influence it. In the MAD, the deviations of a small number of outliers are irrelevant.
Because the MAD is a more robust estimator of scale than the sample variance or standard deviation, it works better with distributions without a mean or variance, such as the Cauchy distribution.
For this indicator, a manual recreation of the quantile function in Pine Script is used. This is so users have a full inside view into how this is calculated.
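A minimal sketch of the MAD calculation, with the median written out by hand in the same spirit as the manual quantile recreation mentioned above.

```python
def median(values):
    # Simple median computed directly rather than via a library call.
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

def median_absolute_deviation(values):
    # MAD = median of the absolute deviations from the median.
    m = median(values)
    return median([abs(v - m) for v in values])
```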
Efficiency-Ratio Adaptive ATR
Average True Range (ATR) is a widely used indicator for many occasions in technical analysis. It is calculated as the RMA of the true range. This version adds a "twist": it uses Perry Kaufman's Efficiency Ratio to calculate adaptive true range.
Mean Absolute Deviation
The mean absolute deviation (MAD) is a measure of variability that indicates the average distance between observations and their mean. MAD uses the original units of the data, which simplifies interpretation. Larger values signify that the data points spread out further from the average. Conversely, lower values correspond to data points bunching closer to it. The mean absolute deviation is also known as the mean deviation and average absolute deviation.
This definition of the mean absolute deviation sounds similar to the standard deviation (SD). While both measure variability, they have different calculations. In recent years, some proponents of MAD have suggested that it replace the SD as the primary measure because it is a simpler concept that better fits real life.
█ Giga Kaleidoscope Modularized Trading System
Core components of an NNFX algorithmic trading strategy
The NNFX algorithm is built on the principles of trend, momentum, and volatility. The core components in the NNFX trading algorithm are:
1. Volatility - price volatility; e.g., Average True Range, True Range Double, Close-to-Close, etc.
2. Baseline - a moving average to identify price trend
3. Confirmation 1 - a technical indicator used to identify trends
4. Confirmation 2 - a technical indicator used to identify trends
5. Continuation - a technical indicator used to identify trends
6. Volatility/Volume - a technical indicator used to identify volatility/volume breakouts/breakdown
7. Exit - a technical indicator used to determine when a trend is exhausted
8. Metamorphosis - a technical indicator that produces a compound signal from the combination of other GKD indicators*
*(not part of the NNFX algorithm)
What is Volatility in the NNFX trading system?
In the NNFX (No Nonsense Forex) trading system, ATR (Average True Range) is typically used to measure the volatility of an asset. It is used as a part of the system to help determine the appropriate stop loss and take profit levels for a trade. ATR is calculated by taking the average of the true range values over a specified period.
True range is calculated as the maximum of the following values:
-Current high minus the current low
-Absolute value of the current high minus the previous close
-Absolute value of the current low minus the previous close
ATR is a dynamic indicator that changes with changes in volatility. As volatility increases, the value of ATR increases, and as volatility decreases, the value of ATR decreases. By using ATR in NNFX system, traders can adjust their stop loss and take profit levels according to the volatility of the asset being traded. This helps to ensure that the trade is given enough room to move, while also minimizing potential losses.
Other types of volatility include True Range Double (TRD), Close-to-Close, and Garman-Klass
What is a Baseline indicator?
The baseline is essentially a moving average, and is used to determine the overall direction of the market.
The baseline in the NNFX system is used to filter out trades that are not in line with the long-term trend of the market. The baseline is plotted on the chart along with other indicators, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR).
Trades are only taken when the price is in the same direction as the baseline. For example, if the baseline is sloping upwards, only long trades are taken, and if the baseline is sloping downwards, only short trades are taken. This approach helps to ensure that trades are in line with the overall trend of the market, and reduces the risk of entering trades that are likely to fail.
By using a baseline in the NNFX system, traders can have a clear reference point for determining the overall trend of the market, and can make more informed trading decisions. The baseline helps to filter out noise and false signals, and ensures that trades are taken in the direction of the long-term trend.
What is a Confirmation indicator?
Confirmation indicators are technical indicators that are used to confirm the signals generated by primary indicators. Primary indicators are the core indicators used in the NNFX system, such as the Average True Range (ATR), the Moving Average (MA), and the Relative Strength Index (RSI).
The purpose of the confirmation indicators is to reduce false signals and improve the accuracy of the trading system. They are designed to confirm the signals generated by the primary indicators by providing additional information about the strength and direction of the trend.
Some examples of confirmation indicators that may be used in the NNFX system include the Bollinger Bands, the MACD (Moving Average Convergence Divergence), and the MACD Oscillator. These indicators can provide information about the volatility, momentum, and trend strength of the market, and can be used to confirm the signals generated by the primary indicators.
In the NNFX system, confirmation indicators are used in combination with primary indicators and other filters to create a trading system that is robust and reliable. By using multiple indicators to confirm trading signals, the system aims to reduce the risk of false signals and improve the overall profitability of the trades.
What is a Continuation indicator?
In the NNFX (No Nonsense Forex) trading system, a continuation indicator is a technical indicator that is used to confirm a current trend and predict that the trend is likely to continue in the same direction. A continuation indicator is typically used in conjunction with other indicators in the system, such as a baseline indicator, to provide a comprehensive trading strategy.
What is a Volatility/Volume indicator?
Volume indicators, such as the On Balance Volume (OBV), the Chaikin Money Flow (CMF), or the Volume Price Trend (VPT), are used to measure the amount of buying and selling activity in a market. They are based on the trading volume of the market, and can provide information about the strength of the trend. In the NNFX system, volume indicators are used to confirm trading signals generated by the Moving Average and the Relative Strength Index. Volatility indicators include Average Direction Index, Waddah Attar, and Volatility Ratio. In the NNFX trading system, volatility is a proxy for volume and vice versa.
By using volume indicators as confirmation tools, the NNFX trading system aims to reduce the risk of false signals and improve the overall profitability of trades. These indicators can provide additional information about the market that is not captured by the primary indicators, and can help traders to make more informed trading decisions. In addition, volume indicators can be used to identify potential changes in market trends and to confirm the strength of price movements.
What is an Exit indicator?
The exit indicator is used in conjunction with other indicators in the system, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR), to provide a comprehensive trading strategy.
The exit indicator in the NNFX system can be any technical indicator that is deemed effective at identifying optimal exit points. Examples of exit indicators that are commonly used include the Parabolic SAR, and the Average Directional Index (ADX).
The purpose of the exit indicator is to identify when a trend is likely to reverse or when the market conditions have changed, signaling the need to exit a trade. By using an exit indicator, traders can manage their risk and prevent significant losses.
In the NNFX system, the exit indicator is used in conjunction with a stop loss and a take profit order to maximize profits and minimize losses. The stop loss order is used to limit the amount of loss that can be incurred if the trade goes against the trader, while the take profit order is used to lock in profits when the trade is moving in the trader's favor.
Overall, the use of an exit indicator in the NNFX trading system is an important component of a comprehensive trading strategy. It allows traders to manage their risk effectively and improve the profitability of their trades by exiting at the right time.
What is a Metamorphosis indicator?
The concept of a metamorphosis indicator involves the integration of two or more GKD indicators to generate a compound signal. This is achieved by evaluating the accuracy of each indicator and selecting the signal from the indicator with the highest accuracy. As an illustration, let's consider a scenario where we calculate the accuracy of 10 indicators and choose the signal from the indicator that demonstrates the highest accuracy.
The resulting output from the metamorphosis indicator can then be utilized in a GKD-BT backtest by occupying a slot that aligns with the purpose of the metamorphosis indicator. The slot can be a GKD-B, GKD-C, or GKD-E slot, depending on the specific requirements and objectives of the indicator. This allows for seamless integration and utilization of the compound signal within the GKD-BT framework.
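A minimal sketch of the selection idea described above; the indicator names and accuracy values are hypothetical.

```python
def metamorphosis_signal(signals, accuracies):
    # `signals` maps an indicator name to its current signal (+1, 0, -1);
    # `accuracies` maps the same names to a measured hit rate. The compound
    # signal is simply the signal of the most accurate indicator.
    best = max(accuracies, key=accuracies.get)
    return best, signals[best]

# Example with three hypothetical indicators.
sigs = {"ind_a": 1, "ind_b": -1, "ind_c": 1}
accs = {"ind_a": 0.52, "ind_b": 0.61, "ind_c": 0.48}
print(metamorphosis_signal(sigs, accs))   # ('ind_b', -1)
```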
How does Loxx's GKD (Giga Kaleidoscope Modularized Trading System) implement the NNFX algorithm outlined above?
Loxx's GKD v2.0 system has six types of modules (indicators/strategies). These modules are:
1. GKD-BT - Backtesting module (Volatility, Number 1 in the NNFX algorithm)
2. GKD-B - Baseline module (Baseline and Volatility/Volume, Numbers 1 and 2 in the NNFX algorithm)
3. GKD-C - Confirmation 1/2 and Continuation module (Confirmation 1/2 and Continuation, Numbers 3, 4, and 5 in the NNFX algorithm)
4. GKD-V - Volatility/Volume module (Volatility/Volume, Number 6 in the NNFX algorithm)
5. GKD-E - Exit module (Exit, Number 7 in the NNFX algorithm)
6. GKD-M - Metamorphosis module (Metamorphosis, Number 8 above; not part of the NNFX algorithm)
(additional module types will be added in future releases)
Each module interacts with the others by passing data along the chain to a backtest module, wherein the various components of the GKD system are combined to create a trading signal.
That is, the Baseline indicator passes its data to Volatility/Volume. The Volatility/Volume indicator passes its values to the Confirmation 1 indicator. The Confirmation 1 indicator passes its values to the Confirmation 2 indicator. The Confirmation 2 indicator passes its values to the Continuation indicator. The Continuation indicator passes its values to the Exit indicator, and finally, the Exit indicator passes its values to the Backtest strategy.
This chaining of indicators requires that each module conform to Loxx's GKD protocol, therefore allowing for the testing of every possible combination of technical indicators that make up the six components of the NNFX algorithm.
What does the application of the GKD trading system look like?
Example trading system:
Backtest: Multi-Ticker CC Backtest
Baseline: Hull Moving Average
Volatility/Volume: Hurst Exponent
Confirmation 1: Advance Trend Pressure as shown on the chart above
Confirmation 2: uf2018
Continuation: Coppock Curve
Exit: Rex Oscillator
Metamorphosis: Baseline Optimizer
Each GKD indicator is denoted with a module identifier of either: GKD-BT, GKD-B, GKD-C, GKD-V, GKD-M, or GKD-E. This allows traders to understand to which module each indicator belongs and where each indicator fits into the GKD system.
█ Giga Kaleidoscope Modularized Trading System Signals
Standard Entry
1. GKD-C Confirmation gives signal
2. Baseline agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Volatility/Volume agrees
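A minimal sketch of the Standard Entry above as a boolean combination; the parameter names are illustrative, not the module's inputs.

```python
def standard_entry_long(c1_signal_long, baseline_agrees, inside_zone_min,
                        inside_zone_max, c2_agrees, vol_volume_agrees):
    # All six conditions from the Standard Entry list must hold at once.
    return all([c1_signal_long, baseline_agrees, inside_zone_min,
                inside_zone_max, c2_agrees, vol_volume_agrees])

# Example: every condition true except the Volatility/Volume filter -> no entry.
print(standard_entry_long(True, True, True, True, True, False))   # False
```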
1-Candle Standard Entry
1a. GKD-C Confirmation gives signal
2a. Baseline agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
Next Candle
1b. Price retraced
2b. Baseline agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Baseline Entry
1. GKD-B Baseline gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Volatility/Volume agrees
7. Confirmation 1 signal was less than 'Maximum Allowable PSBC Bars Back' prior
1-Candle Baseline Entry
1a. GKD-B Baseline gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSBC Bars Back' prior
Next Candle
1b. Price retraced
2b. Baseline agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Volatility/Volume Entry
1. GKD-V Volatility/Volume gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Baseline agrees
7. Confirmation 1 signal was less than 7 candles prior
1-Candle Volatility/Volume Entry
1a. GKD-V Volatility/Volume gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSVVC Bars Back' prior
Next Candle
1b. Price retraced
2b. Volatility/Volume agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Baseline agrees
Confirmation 2 Entry
1. GKD-C Confirmation 2 gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Volatility/Volume agrees
6. Baseline agrees
7. Confirmation 1 signal was less than 7 candles prior
1-Candle Confirmation 2 Entry
1a. GKD-C Confirmation 2 gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSC2C Bars Back' prior
Next Candle
1b. Price retraced
2b. Confirmation 2 agrees
3b. Confirmation 1 agrees
4b. Volatility/Volume agrees
5b. Baseline agrees
PullBack Entry
1a. GKD-B Baseline gives signal
2a. Confirmation 1 agrees
3a. Price is beyond 1.0x Volatility of Baseline
Next Candle
1b. Price inside Goldie Locks Zone Minimum
2b. Price inside Goldie Locks Zone Maximum
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Continuation Entry
1. Standard Entry, 1-Candle Standard Entry, Baseline Entry, 1-Candle Baseline Entry, Volatility/Volume Entry, 1-Candle Volatility/Volume Entry, Confirmation 2 Entry, 1-Candle Confirmation 2 Entry, or Pullback entry triggered previously
2. Baseline hasn't crossed since entry signal trigger
3. Confirmation 1 agrees
4. Baseline agrees
5. Confirmation 2 agrees
GKD-B Multi-Ticker Baseline [Loxx]
Giga Kaleidoscope GKD-B Multi-Ticker Baseline is a Baseline module included in Loxx's "Giga Kaleidoscope Modularized Trading System".
This is a special implementation of GKD-B Baseline that allows the trader to input multiple tickers to be passed onto a GKD-BT Multi-Ticker Backtest. This baseline can only be used with the GKD-BT Multi-Ticker Backtests.
GKD-B Multi-Ticker Baseline includes 64 different moving averages:
Adaptive Moving Average - AMA
ADXvma - Average Directional Volatility Moving Average
Ahrens Moving Average
Alexander Moving Average - ALXMA
Deviation Scaled Moving Average - DSMA
Donchian
Double Exponential Moving Average - DEMA
Double Smoothed Exponential Moving Average - DSEMA
Double Smoothed FEMA - DSFEMA
Double Smoothed Range Weighted EMA - DSRWEMA
Double Smoothed Wilders EMA - DSWEMA
Double Weighted Moving Average - DWMA
Ehlers Optimal Tracking Filter - EOTF
Exponential Moving Average - EMA
Fast Exponential Moving Average - FEMA
Fractal Adaptive Moving Average - FRAMA
Generalized DEMA - GDEMA
Generalized Double DEMA - GDDEMA
Hull Moving Average (Type 1) - HMA1
Hull Moving Average (Type 2) - HMA2
Hull Moving Average (Type 3) - HMA3
Hull Moving Average (Type 4) - HMA4
IE/2 - Early T3 by Tim Tillson
Integral of Linear Regression Slope - ILRS
Instantaneous Trendline
Kalman Filter
Kaufman Adaptive Moving Average - KAMA
Laguerre Filter
Leader Exponential Moving Average
Linear Regression Value - LSMA ( Least Squares Moving Average )
Linear Weighted Moving Average - LWMA
McGinley Dynamic
McNicholl EMA
Non-Lag Moving Average
Ocean NMA Moving Average - ONMAMA
One More Moving Average - OMA
Parabolic Weighted Moving Average
Probability Density Function Moving Average - PDFMA
Quadratic Regression Moving Average - QRMA
Regularized EMA - REMA
Range Weighted EMA - RWEMA
Recursive Moving Trendline
Simple Decycler - SDEC
Simple Jurik Moving Average - SJMA
Simple Moving Average - SMA
Sine Weighted Moving Average
Smoothed LWMA - SLWMA
Smoothed Moving Average - SMMA
Smoother
Super Smoother
T3
Three-pole Ehlers Butterworth
Three-pole Ehlers Smoother
Triangular Moving Average - TMA
Triple Exponential Moving Average - TEMA
Two-pole Ehlers Butterworth
Two-pole Ehlers smoother
Variable Index Dynamic Average - VIDYA
Variable Moving Average - VMA
Volume Weighted EMA - VEMA
Volume Weighted Moving Average - VWMA
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
Zero-Lag Moving Average
Zero Lag TEMA - Zero Lag Triple Exponential Moving Average
Adaptive Moving Average - AMA
The Adaptive Moving Average (AMA) is a moving average that changes its sensitivity to price moves depending on the calculated volatility. It becomes more sensitive during periods when the price is moving smoothly in a certain direction and becomes less sensitive when the price is volatile.
ADXvma - Average Directional Volatility Moving Average
Linnsoft's ADXvma formula is a volatility-based moving average, with the volatility being determined by the value of the ADX indicator.
The ADXvma replaces the SMA in Chande's CMO with an EMA; it then uses a few more layers of EMA smoothing before the "Volatility Index" is calculated.
A side effect is that those additional layers slow down the ADXvma compared to Chande's Variable Index Dynamic Average (VIDYA).
The ADXVMA provides support during uptrends and resistance during downtrends and will stay flat for longer, but will create some of the most accurate market signals when it decides to move.
Ahrens Moving Average
Richard D. Ahrens's Moving Average promises "Smoother Data" that isn't influenced by the occasional price spike. It works by using the Open and the Close in his formula so that the only time the Ahrens Moving Average will change is when the candlestick is either making new highs or new lows.
Alexander Moving Average - ALXMA
This Moving Average uses an elaborate smoothing formula and utilizes a 7 period Moving Average. It corresponds to fitting a second-order polynomial to seven consecutive observations. This moving average is rarely used in trading but is interesting as this Moving Average has been applied to diffusion indexes that tend to be very volatile.
Deviation Scaled Moving Average - DSMA
The Deviation-Scaled Moving Average is a data smoothing technique that acts like an exponential moving average with a dynamic smoothing coefficient. The smoothing coefficient is automatically updated based on the magnitude of price changes. In the Deviation-Scaled Moving Average, the standard deviation from the mean is chosen to be the measure of this magnitude. The resulting indicator provides substantial smoothing of the data even when price changes are small while quickly adapting to these changes.
Donchian
Donchian Channels are three lines generated by moving average calculations that comprise an indicator formed by upper and lower bands around a midrange or median band. The upper band marks the highest price of a security over N periods while the lower band marks the lowest price of a security over N periods.
Double Exponential Moving Average - DEMA
The Double Exponential Moving Average (DEMA) combines a smoothed EMA and a single EMA to provide a low-lag indicator. Its primary purpose is to reduce the amount of "lagging entry" opportunities, and like all Moving Averages, the DEMA confirms uptrends whenever price crosses on top of it and closes above it, and confirms downtrends when the price crosses under it and closes below it - but with significantly less lag.
Double Smoothed Exponential Moving Average - DSEMA
The Double Smoothed Exponential Moving Average is a lot less laggy compared to a traditional EMA . It's also considered a leading indicator compared to the EMA , and is best utilized whenever smoothness and speed of reaction to market changes are required.
Double Smoothed FEMA - DSFEMA
Same as the Double Exponential Moving Average (DEMA), but uses a faster version of EMA for its calculation.
Double Smoothed Range Weighted EMA - DSRWEMA
The range weighted exponential moving average (EMA) is, unlike the "regular" range weighted average, calculated in a different way. Even though the basis, the range weighting, is the same, the calculation is completely different: by definition this type of EMA is calculated as the ratio of the EMA of price*weight to the EMA of weight. The results are very different, and the two should be considered completely different types of averages. The higher-than-EMA responsiveness to price changes when ranges increase remains in this EMA too, and in those cases this EMA clearly leads the "regular" EMA. This version includes double smoothing.
Double Smoothed Wilders EMA - DSWEMA
Welles Wilder frequently used one "special" case of the EMA (Exponential Moving Average) that is, for that reason, sometimes called Wilder's EMA. This version adds double smoothing to Wilder's EMA in order to make it "faster" (more responsive to market prices than the original) while still keeping very smooth values.
Double Weighted Moving Average - DWMA
The double weighted moving average is an LWMA (Linear Weighted Moving Average) applied twice: instead of running the weighting loop once, the indicator cycles it two times, which produces smoother values than the original LWMA.
Ehlers Optimal Tracking Filter - EOTF
Ehlers's Optimum Tracking Filter quickly adjusts to rapid shifts in price yet remains relatively smooth when price moves sideways. The operation of this filter is similar to Kaufman's Adaptive Moving Average.
Exponential Moving Average - EMA
The EMA places more significance on recent data points and moves closer to price than the SMA ( Simple Moving Average ). It reacts faster to volatility due to its emphasis on recent data and is known for its ability to give greater weight to recent and more relevant data. The EMA is therefore seen as an enhancement over the SMA .
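A minimal sketch of the recursive EMA calculation, in Python rather than Pine Script, using the conventional smoothing factor alpha = 2 / (length + 1):

```python
def ema(prices, length):
    """Recursive EMA: each new value blends the current price with the prior EMA."""
    alpha = 2.0 / (length + 1)
    value = prices[0]                      # seed with the first price
    for price in prices[1:]:
        value = alpha * price + (1 - alpha) * value
    return value
```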
Fast Exponential Moving Average - FEMA
An Exponential Moving Average with a short look-back period.
Fractal Adaptive Moving Average - FRAMA
The Fractal Adaptive Moving Average by John Ehlers is an intelligent adaptive Moving Average which takes the importance of price changes into account and follows price closely enough to display significant moves whilst remaining flat if price ranges. The FRAMA does this by dynamically adjusting the look-back period based on the market's fractal geometry.
Generalized DEMA - GDEMA
The double exponential moving average (DEMA) was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages". Instead of using a fixed multiplication factor in the final DEMA formula, the generalized version allows you to change it. By varying the "volume factor" from 0 to 1 you apply different multiplications and thus produce a DEMA with a different "speed": the higher the volume factor, the "faster" the DEMA (but also the less smooth its slope). The volume factor is limited in the calculation to 1, since any volume factor larger than 1 increases overshooting to the point that some values make the indicator unusable.
Generalized Double DEMA - GDDEMA
The double exponential moving average (DEMA) was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages". This is an extension of the Generalized DEMA using Tim Tillson's (the inventor of T3) idea: it uses GDEMA of GDEMA for the calculation, which is the "middle step" of the T3 calculation. Since there are no versions showing that middle step, this version covers that too. The result is smoother than the Generalized DEMA but less smooth than T3; one has to experiment to find the optimal way to use it, but in any case, since it is "faster" than T3 and still smooth, it looks like a good compromise between speed and smoothness.
Hull Moving Average (Type 1) - HMA1
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMA for smoothing.
Hull Moving Average (Type 2) - HMA2
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses EMA for smoothing.
Hull Moving Average (Type 3) - HMA3
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses LWMA for smoothing.
Hull Moving Average (Type 4) - HMA4
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMMA for smoothing.
IE /2 - Early T3 by Tim Tilson and T3 new
The T3 moving average is a type of technical indicator used in financial analysis to identify trends in price movements. It is similar to the Exponential Moving Average (EMA) and the Double Exponential Moving Average (DEMA), but uses a different smoothing algorithm.
The T3 moving average is calculated using a series of exponential moving averages that are designed to filter out noise and smooth the data. The resulting smoothed data is then weighted with a non-linear function to produce a final output that is more responsive to changes in trend direction.
The T3 moving average can be customized by adjusting the length of the moving average, as well as the weighting function used to smooth the data. It is commonly used in conjunction with other technical indicators as part of a larger trading strategy.
Integral of Linear Regression Slope - ILRS
A Moving Average where the slope of a linear regression line is simply integrated as it is fitted in a moving window of length N across the data. The derivative of ILRS is the linear regression slope. ILRS is not the same as an SMA ( Simple Moving Average ) of length N, which is actually the midpoint of the linear regression line as it moves across the data.
Instantaneous Trendline
The Instantaneous Trendline is created by removing the dominant cycle component from the price information which makes this Moving Average suitable for medium to long-term trading.
Kalman Filter
Kalman filter is an algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies. This means that the filter was originally designed to work with noisy data. Also, it is able to work with incomplete data. Another advantage is that it is designed for and applied in dynamic systems; our price chart belongs to such systems. This version is true to the original design of the trade-ready Kalman Filter where velocity is the triggering mechanism.
Kalman Filter is a more accurate smoothing/prediction algorithm than the moving average because it is adaptive: it accounts for estimation errors and tries to adjust its predictions from the information it learned in the previous stage. Theoretically, Kalman Filter consists of measurement and transition components.
Kaufman Adaptive Moving Average - KAMA
Developed by Perry Kaufman, Kaufman's Adaptive Moving Average (KAMA) is a moving average designed to account for market noise or volatility. KAMA will closely follow prices when the price swings are relatively small and the noise is low.
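To make the adaptive idea concrete, here is a rough Python sketch of Kaufman's formula; the fast/slow lengths shown are the commonly cited defaults and may differ from the indicator's settings:

```python
def kama(prices, er_length=10, fast=2, slow=30):
    """KAMA: an EMA whose smoothing constant scales with the efficiency ratio (signal/noise)."""
    fast_sc, slow_sc = 2.0 / (fast + 1), 2.0 / (slow + 1)
    value = prices[0]
    out = [value]
    for i in range(1, len(prices)):
        lo = max(0, i - er_length)
        change = abs(prices[i] - prices[lo])                     # net move over the window
        noise = sum(abs(prices[j] - prices[j - 1]) for j in range(lo + 1, i + 1))
        er = change / noise if noise else 0.0                    # efficiency ratio in [0, 1]
        sc = (er * (fast_sc - slow_sc) + slow_sc) ** 2           # scaled smoothing constant
        value += sc * (prices[i] - value)
        out.append(value)
    return out
```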
Laguerre Filter
The Laguerre Filter is a smoothing filter based on Laguerre polynomials. The filter requires the current price, three prior prices, and a user-defined factor called Alpha to complete its calculation.
Adjusting the Alpha coefficient increases or decreases its lag and its smoothness.
Leader Exponential Moving Average
The Leader EMA was created by Giorgos E. Siligardos, who designed a Moving Average able to eliminate lag altogether whilst maintaining some smoothness. It was first described in his research paper "MACD Leader", where he applied it to the MACD to improve its signals and remove its lagging issue. This filter uses his leading MACD's "modified EMA" and can be used as a zero-lag filter.
Linear Regression Value - LSMA ( Least Squares Moving Average )
LSMA as a Moving Average is based on plotting the end point of the linear regression line. It compares the current value to the prior value to determine a possible trend, e.g., whether the linear regression line is pointing up or down.
Linear Weighted Moving Average - LWMA
LWMA reacts to price quicker than the SMA and EMA . Although it's similar to the Simple Moving Average , the difference is that a weight coefficient is multiplied to the price which means the most recent price has the highest weighting, and each prior price has progressively less weight. The weights drop in a linear fashion.
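A minimal illustration of the linear weighting (Python, not the indicator's code):

```python
def lwma(prices, length):
    """LWMA: the most recent bar gets weight `length`, the oldest bar gets weight 1."""
    window = prices[-length:]
    weights = range(1, len(window) + 1)          # 1 .. length, oldest to newest
    return sum(w * p for w, p in zip(weights, window)) / sum(weights)
```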
McGinley Dynamic
John McGinley created this Moving Average to track prices better than traditional Moving Averages. It does this by incorporating an automatic adjustment factor into its formula, which speeds (or slows) the indicator in trending, or ranging, markets.
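One commonly cited form of the McGinley Dynamic is sketched below; the constant k = 0.6 is the textbook value, and the indicator's exact parameterization may differ:

```python
def mcginley_dynamic(prices, length=14, k=0.6):
    """McGinley Dynamic: tracking speed adapts to how far price has moved from the average."""
    md = prices[0]
    out = [md]
    for price in prices[1:]:
        md = md + (price - md) / (k * length * (price / md) ** 4)
        out.append(md)
    return out
```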
McNicholl EMA
Dennis McNicholl developed this Moving Average to use as the center line for his "Better Bollinger Bands" indicator. It was successful because it responded better to volatility changes than the standard SMA and managed to avoid common whipsaws.
Non-lag moving average
The Non Lag Moving Average follows price closely and gives very quick signals as well as early signals of price change. It should not be used on its own, but rather as an additional confluence tool for early signals.
Ocean NMA Moving Average - ONMAMA
Created by Jim Sloman, the NMA is a moving average that automatically adjusts to volatility without being programmed to do so. For more info, read his guide "Ocean Theory, an Introduction"
One More Moving Average (OMA)
The One More Moving Average (OMA) is a technical indicator that calculates a series of Jurik-style moving averages in order to reduce noise and provide smoother price data. It uses six exponential moving averages to generate the final value, with the length of the moving averages determined by an adaptive algorithm that adjusts to the current market conditions. The algorithm calculates the average period by comparing the signal to noise ratio and using this value to determine the length of the moving averages. The resulting values are used to generate the final value of the OMA, which can be used to identify trends and potential changes in trend direction.
Parabolic Weighted Moving Average
The Parabolic Weighted Moving Average is a variation of the Linear Weighted Moving Average . The Linear Weighted Moving Average calculates the average by assigning different weights to each element in its calculation. The Parabolic Weighted Moving Average is a variation that allows weights to be changed to form a parabolic curve. It is done simply by using the Power parameter of this indicator.
Probability Density Function Moving Average - PDFMA
The probability density function based MA is a sort of weighted moving average that uses a probability density function to calculate the weights. By its nature it is similar to many digital filters.
Quadratic Regression Moving Average - QRMA
A quadratic regression is the process of finding the equation of the parabola that best fits a set of data. This moving average is an obscure concept that was posted to Forex forums in around 2008.
Regularized EMA - REMA
The regularized exponential moving average (REMA) by Chris Satchwell is a variation on the EMA (see Exponential Moving Average) designed to be smoother but not introduce too much extra lag.
Range Weighted EMA - RWEMA
This indicator is a variation of the range weighted EMA. The variation comes from a possible need to make that indicator a bit less "noisy" when it comes to slope changes. The method used for calculating this variation is the method described by Lee Leibfarth in his article "Trading With An Adaptive Price Zone".
Recursive Moving Trendline
Dennis Meyers's Recursive Moving Trendline uses a recursive (repeated application of a rule) polynomial fit, a technique that uses a small number of past price estimates together with today's price to predict tomorrow's price.
Simple Decycler - SDEC
The Ehlers Simple Decycler study is a virtually zero-lag technical indicator proposed by John F. Ehlers. The original idea behind this study (and several others created by John F. Ehlers) is that market data can be considered a continuum of cycle periods with different cycle amplitudes. Thus, trending periods can be considered segments of longer cycles, or, in other words, low-frequency segments. Applying the right filter might help identify these segments.
Simple Loxx Moving Average - SLMA
A three stage moving average combining an adaptive EMA, a Kalman Filter, and a Kauffman adaptive filter.
Simple Moving Average - SMA
The SMA calculates the average of a range of prices by adding recent prices and then dividing that figure by the number of time periods in the calculation average. It is the most basic Moving Average which is seen as a reliable tool for starting off with Moving Average studies. As reliable as it may be, the basic moving average will work better when it's enhanced into an EMA .
Sine Weighted Moving Average
The Sine Weighted Moving Average assigns the most weight to the middle of the data set. It does this by taking its weights from the first half of a sine-wave cycle, so the data in the middle of the window receives the greatest weighting. The Sine WMA closely resembles the TMA (Triangular Moving Average).
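A small sketch of that weighting, assuming the weights are taken from the first half of a sine cycle:

```python
import math

def sine_wma(prices, length):
    """Sine Weighted MA: weights follow half a sine cycle, peaking at the middle of the window."""
    window = prices[-length:]
    weights = [math.sin(math.pi * (i + 1) / (len(window) + 1)) for i in range(len(window))]
    return sum(w * p for w, p in zip(weights, window)) / sum(weights)
```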
Smoothed LWMA - SLWMA
A smoothed version of the LWMA
Smoothed Moving Average - SMMA
The Smoothed Moving Average is similar to the Simple Moving Average ( SMA ), but aims to reduce noise rather than reduce lag. SMMA takes all prices into account and uses a long lookback period. Due to this, it's seen as an accurate yet laggy Moving Average.
Smoother
The Smoother filter is a faster-reacting smoothing technique which generates considerably less lag than the SMMA ( Smoothed Moving Average ). It gives earlier signals but can also create false signals due to its earlier reactions. This filter is sometimes mistaken for the superior Jurik Smoothing algorithm.
Super Smoother
The Super Smoother filter uses John Ehlers's "Super Smoother", which consists of a two-pole Butterworth filter combined with a 2-bar SMA ( Simple Moving Average ) that suppresses the Nyquist frequency, the highest frequency that the sampled data can represent.
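For reference, a Python sketch of Ehlers's published two-pole Super Smoother recursion (the indicator's own implementation may differ in details such as seeding):

```python
import math

def super_smoother(prices, length=10):
    """Ehlers two-pole Super Smoother: a Butterworth filter applied to a 2-bar price average."""
    a1 = math.exp(-math.sqrt(2) * math.pi / length)
    b1 = 2 * a1 * math.cos(math.sqrt(2) * math.pi / length)
    c2, c3 = b1, -a1 * a1
    c1 = 1 - c2 - c3
    filt = list(prices[:2])                       # seed the recursion with the raw prices
    for i in range(2, len(prices)):
        filt.append(c1 * (prices[i] + prices[i - 1]) / 2 + c2 * filt[-1] + c3 * filt[-2])
    return filt
```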
Three-pole Ehlers Butterworth
The 3 pole Ehlers Butterworth (as well as the Two pole Butterworth) are both superior alternatives to the EMA and SMA . They aim at producing less lag whilst maintaining accuracy. The 2 pole filter will give you a better approximation for price, whereas the 3 pole filter has superior smoothing.
Three-pole Ehlers smoother
The 3 pole Ehlers smoother works almost as close to price as the above mentioned 3 Pole Ehlers Butterworth. It acts as a strong baseline for signals but removes some noise. Side by side, it hardly differs from the Three Pole Ehlers Butterworth but when examined closely, it has better overshoot reduction compared to the 3 pole Ehlers Butterworth.
Triangular Moving Average - TMA
The TMA is similar to the EMA but uses a different weighting scheme. Exponential and weighted Moving Averages assign more weight to the most recent price data, while Simple Moving Averages assign weight equally across all the price data. With a TMA (Triangular Moving Average), the data is double smoothed (averaged twice), so the majority of the weight is assigned to the middle portion of the data.
Triple Exponential Moving Average - TEMA
The TEMA uses multiple EMA calculations as well as subtracting lag to create a tool which can be used for scalping pullbacks. As it follows price closely, its signals are considered very noisy and should only be used in extremely fast-paced trading conditions.
Two-pole Ehlers Butterworth
The 2 pole Ehlers Butterworth (as well as the three pole Butterworth mentioned above) is another filter that cuts out the noise and follows the price closely. The 2 pole is seen as a faster, leading filter over the 3 pole and follows price a bit more closely. Analysts will utilize both a 2 pole and a 3 pole Butterworth on the same chart using the same period; having both on the chart allows their crosses to be traded.
Two-pole Ehlers smoother
A smoother version of the Two-pole Ehlers Butterworth, and the faster of the two Ehlers smoothers. It does a decent job of cutting out market noise whilst following price more closely than the three-pole Ehlers version.
Variable Index Dynamic Average - VIDYA
Variable Index Dynamic Average Technical Indicator ( VIDYA ) was developed by Tushar Chande. It is an original method of calculating the Exponential Moving Average ( EMA ) with the dynamically changing period of averaging.
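A rough sketch of Chande's VIDYA using the absolute CMO to scale the EMA alpha (Python, illustrative default lengths):

```python
def vidya(prices, length=9, cmo_length=9):
    """VIDYA: an EMA whose alpha is scaled by the absolute Chande Momentum Oscillator."""
    alpha = 2.0 / (length + 1)
    value = prices[0]
    out = [value]
    for i in range(1, len(prices)):
        lo = max(1, i - cmo_length + 1)
        ups = sum(max(prices[j] - prices[j - 1], 0.0) for j in range(lo, i + 1))
        downs = sum(max(prices[j - 1] - prices[j], 0.0) for j in range(lo, i + 1))
        cmo = abs((ups - downs) / (ups + downs)) if (ups + downs) else 0.0
        value = alpha * cmo * prices[i] + (1 - alpha * cmo) * value
        out.append(value)
    return out
```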
Variable Moving Average - VMA
The Variable Moving Average (VMA) is a study that uses an Exponential Moving Average being able to automatically adjust its smoothing factor according to the market volatility.
Volume Weighted EMA - VEMA
An EMA that uses a volume and price weighted calculation instead of the standard price input.
Volume Weighted Moving Average - VWMA
A Volume Weighted Moving Average is a moving average where more weight is given to bars with heavy volume than with light volume. Thus the value of the moving average will be closer to where most trading actually happened than it otherwise would be without being volume weighted.
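The volume weighting itself is simple; a minimal sketch:

```python
def vwma(prices, volumes, length):
    """VWMA: each bar's price is weighted by the volume traded on that bar."""
    p, v = prices[-length:], volumes[-length:]
    return sum(pi * vi for pi, vi in zip(p, v)) / sum(v)
```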
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
John Ehlers's Zero Lag DEMA's aim is to eliminate the inherent lag associated with all trend following indicators which average a price over time. Because this is a Double Exponential Moving Average with Zero Lag, it has a tendency to overshoot and create a lot of false signals for swing trading. It can however be used for quick scalping or as a secondary indicator for confluence.
Zero-Lag Moving Average
The Zero Lag Moving Average is described by its creator, John Ehlers , as a Moving Average with absolutely no delay. For this reason the filter produces many abrupt signals, which will not be ideal for medium to long-term traders. It is designed to follow price as closely as possible whilst de-lagging the data, which it does by attempting to remove the cumulative effect of the Moving Average.
Zero-Lag TEMA - Zero Lag Triple Exponential Moving Average
Just like the Zero Lag DEMA , this filter will give you the fastest signals out of all the Zero Lag Moving Averages. This is useful for scalping but dangerous for medium to long-term traders, especially during market Volatility and news events. Having no lag, this filter also has no smoothing in its signals and can cause some very bizarre behavior when applied to certain indicators.
█ Volatility Goldie Locks Zone
This volatility filter is the standard first-pass filter used in all NNFX systems, alongside the additional volatility/volume filter used in step 5. For this filter, price must fall into a range between maximum and minimum values calculated as multiples of volatility. Unlike standard NNFX systems, this version of volatility filtering is separated from the core Baseline and uses its own moving average with Loxx's Exotic Source Types.
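The exact calculation lives inside the indicator, but the gist of the filter can be sketched as follows; the multiplier values and function name here are placeholders, not the script's defaults:

```python
def in_goldie_locks_zone(price, baseline, volatility, min_mult=0.5, max_mult=2.0):
    """Accept a signal only when price sits between a minimum and a maximum multiple
    of volatility away from the baseline (hypothetical multipliers shown)."""
    distance = abs(price - baseline)
    return min_mult * volatility <= distance <= max_mult * volatility
```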
█ Volatility Types included
The GKD system utilizes volatility-based take profits and stop losses. Each take profit and stop loss is calculated as a multiple of volatility. You can change the values of the multipliers in the settings as well.
This module includes 17 types of volatility:
Close-to-Close
Parkinson
Garman-Klass
Rogers-Satchell
Yang-Zhang
Garman-Klass-Yang-Zhang
Exponential Weighted Moving Average
Standard Deviation of Log Returns
Pseudo GARCH(2,2)
Average True Range
True Range Double
Standard Deviation
Adaptive Deviation
Median Absolute Deviation
Efficiency-Ratio Adaptive ATR
Mean Absolute Deviation
Static Percent
These are various volatility estimators and indicators that investors and traders can use to measure the dispersion or volatility of a financial instrument's price. Each estimator has its strengths and weaknesses, and the choice of estimator should depend on the specific needs and circumstances of the user.
Close-to-Close
Close-to-Close volatility is a classic and widely used volatility measure, sometimes referred to as historical volatility.
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a larger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility is calculated using only a stock's closing prices. It is the simplest volatility estimator. However, in many cases, it is not precise enough. Stock prices could jump significantly during a trading session and return to the opening value at the end. That means that a considerable amount of price information is not taken into account by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. This is useful as close-to-close prices could show little difference while large price movements could have occurred during the day. Thus, Parkinson's volatility is considered more precise and requires less data for calculation than close-to-close volatility.
One drawback of this estimator is that it doesn't take into account price movements after the market closes. Hence, it systematically undervalues volatility. This drawback is addressed in the Garman-Klass volatility estimator.
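A minimal Python sketch of the Parkinson estimator, annualized with a user-supplied periods-per-year value:

```python
import math

def parkinson_vol(highs, lows, periods_per_year=252):
    """Parkinson estimator: volatility from intrabar high/low ranges."""
    n = len(highs)
    s = sum(math.log(h / l) ** 2 for h, l in zip(highs, lows))
    return math.sqrt(s / (4 * math.log(2) * n)) * math.sqrt(periods_per_year)
```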
Garman-Klass
Garman-Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing prices. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change follows a continuous diffusion process (Geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremes.
Researchers Rogers and Satchell have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
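The Garman-Klass formula combines the high/low range term with an open-to-close correction; a sketch:

```python
import math

def garman_klass_vol(opens, highs, lows, closes, periods_per_year=252):
    """Garman-Klass estimator: uses the open, high, low, and close of each bar."""
    n = len(opens)
    s = sum(0.5 * math.log(h / l) ** 2 - (2 * math.log(2) - 1) * math.log(c / o) ** 2
            for o, h, l, c in zip(opens, highs, lows, closes))
    return math.sqrt(s / n) * math.sqrt(periods_per_year)
```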
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates a drift term (mean return not equal to zero). As a result, it provides better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. This leads to an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchel formula in the 2000s by Yang-Zhang. See Yang Zhang Volatility for more detail.
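A sketch of the Rogers-Satchell terms, which remain unbiased when the series drifts:

```python
import math

def rogers_satchell_vol(opens, highs, lows, closes, periods_per_year=252):
    """Rogers-Satchell estimator: drift-independent combination of OHLC log ratios."""
    n = len(opens)
    s = sum(math.log(h / c) * math.log(h / o) + math.log(l / c) * math.log(l / o)
            for o, h, l, c in zip(opens, highs, lows, closes))
    return math.sqrt(s / n) * math.sqrt(periods_per_year)
```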
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
Yang-Zhang volatility can be thought of as a combination of the overnight (close-to-open volatility) and a weighted average of the Rogers-Satchell volatility and the day’s open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
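A compact sketch of that combination (overnight variance, open-to-close variance, and the Rogers-Satchell term, weighted by the usual k); this is illustrative and assumes at least a few bars of data:

```python
import math
from statistics import variance

def yang_zhang_vol(opens, highs, lows, closes, periods_per_year=252):
    """Yang-Zhang: overnight variance + k * open-to-close variance + (1 - k) * Rogers-Satchell."""
    n = len(opens)
    overnight = [math.log(opens[i] / closes[i - 1]) for i in range(1, n)]
    open_close = [math.log(closes[i] / opens[i]) for i in range(1, n)]
    rs = [math.log(highs[i] / closes[i]) * math.log(highs[i] / opens[i])
          + math.log(lows[i] / closes[i]) * math.log(lows[i] / opens[i]) for i in range(1, n)]
    m = n - 1                                     # number of return observations
    k = 0.34 / (1.34 + (m + 1) / (m - 1))
    var_yz = variance(overnight) + k * variance(open_close) + (1 - k) * (sum(rs) / m)
    return math.sqrt(var_yz * periods_per_year)
```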
Garman-Klass-Yang-Zhang
Garman-Klass-Yang-Zhang (GKYZ) volatility estimator incorporates the returns of open, high, low, and closing prices in its calculation.
GKYZ volatility estimator takes into account overnight jumps but not the trend, i.e., it assumes that the underlying asset follows a Geometric Brownian Motion (GBM) process with zero drift. Therefore, the GKYZ volatility estimator tends to overestimate the volatility when the drift is different from zero. However, for a GBM process, this estimator is eight times more efficient than the close-to-close volatility estimator.
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, with the main applications being technical analysis and volatility modeling.
The moving average is designed such that older observations are given lower weights. The weights decrease exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
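As a sketch, the RiskMetrics-style recursion looks like this; note that in this convention lambda weights the previous variance, and the value 0.94 is only the common daily default, not necessarily the indicator's setting:

```python
import math

def ewma_vol(returns, lam=0.94, periods_per_year=252):
    """EWMA variance recursion: var_t = lambda * var_{t-1} + (1 - lambda) * r_t^2."""
    var = returns[0] ** 2                        # seed with the first squared return
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var * periods_per_year)
```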
Standard Deviation of Log Returns
This is the simplest calculation of volatility. It's the standard deviation of ln(close/close(1)).
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θ·avg(var; M) + (1 − θ)·avg(var; N) = 2θ·var / (M + 1 − (M − 1)L) + 2(1 − θ)·var / (N + 1 − (N − 1)L)
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L⁻¹·var − avg(var; N) against avg(var; M) − avg(var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
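A minimal sketch of the true range and its Wilder-smoothed average:

```python
def atr(highs, lows, closes, length=14):
    """ATR: Wilder-smoothed (RMA) average of the true range."""
    trs = []
    for i in range(1, len(closes)):
        trs.append(max(highs[i] - lows[i],
                       abs(highs[i] - closes[i - 1]),
                       abs(lows[i] - closes[i - 1])))
    value = sum(trs[:length]) / length           # seed with a simple average
    for tr in trs[length:]:
        value = (value * (length - 1) + tr) / length
    return value
```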
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Standard Deviation
Standard deviation is a statistic that measures the dispersion of a dataset relative to its mean and is calculated as the square root of the variance, which is itself determined from each data point's deviation relative to the mean. If the data points are further from the mean, there is a higher deviation within the data set; thus, the more spread out the data, the higher the standard deviation.
Adaptive Deviation
By definition, the Standard Deviation (STD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values. In technical analysis, we usually use it to measure the level of current volatility.
Standard Deviation is based on Simple Moving Average calculation for mean value. This version of standard deviation uses the properties of EMA to calculate what can be called a new type of deviation, and since it is based on EMA, we can call it EMA deviation. Additionally, Perry Kaufman's efficiency ratio is used to make it adaptive (since all EMA type calculations are nearly perfect for adapting).
The difference compared to the standard deviation is significant, and not just because of the EMA usage: the efficiency ratio makes it a bit more logical in very volatile market conditions.
Median Absolute Deviation
The median absolute deviation is a measure of statistical dispersion. Moreover, the MAD is a robust statistic, being more resilient to outliers in a data set than the standard deviation. In the standard deviation, the distances from the mean are squared, so large deviations are weighted more heavily, and thus outliers can heavily influence it. In the MAD, the deviations of a small number of outliers are irrelevant.
Because the MAD is a more robust estimator of scale than the sample variance or standard deviation, it works better with distributions without a mean or variance, such as the Cauchy distribution.
For this indicator, a manual recreation of the quantile function in Pine Script is used. This is so users have a full inside view into how this is calculated.
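For comparison with the quantile-based version described above, the statistic itself is just a median of deviations from the median; a Python sketch:

```python
from statistics import median

def median_abs_deviation(values):
    """MAD: median of the absolute deviations from the median."""
    med = median(values)
    return median([abs(v - med) for v in values])
```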
Efficiency-Ratio Adaptive ATR
Average True Range (ATR) is a widely used indicator for many occasions in technical analysis. It is calculated as the RMA of the true range. This version adds a "twist": it uses Perry Kaufman's Efficiency Ratio to calculate adaptive true range.
Mean Absolute Deviation
The mean absolute deviation (MAD) is a measure of variability that indicates the average distance between observations and their mean. MAD uses the original units of the data, which simplifies interpretation. Larger values signify that the data points spread out further from the average. Conversely, lower values correspond to data points bunching closer to it. The mean absolute deviation is also known as the mean deviation and average absolute deviation.
This definition of the mean absolute deviation sounds similar to the standard deviation (SD). While both measure variability, they have different calculations. In recent years, some proponents of MAD have suggested that it replace the SD as the primary measure because it is a simpler concept that better fits real life.
█ Giga Kaleidoscope Modularized Trading System
Core components of an NNFX algorithmic trading strategy
The NNFX algorithm is built on the principles of trend, momentum, and volatility. There are seven core components in the NNFX trading algorithm, plus an eighth component added by the GKD system:
1. Volatility - price volatility; e.g., Average True Range, True Range Double, Close-to-Close, etc.
2. Baseline - a moving average to identify price trend
3. Confirmation 1 - a technical indicator used to identify trends
4. Confirmation 2 - a technical indicator used to identify trends
5. Continuation - a technical indicator used to identify trends
6. Volatility/Volume - a technical indicator used to identify volatility/volume breakouts/breakdown
7. Exit - a technical indicator used to determine when a trend is exhausted
8. Metamorphosis - a technical indicator that produces a compound signal from the combination of other GKD indicators*
*(not part of the NNFX algorithm)
What is Volatility in the NNFX trading system?
In the NNFX (No Nonsense Forex) trading system, ATR (Average True Range) is typically used to measure the volatility of an asset. It is used as a part of the system to help determine the appropriate stop loss and take profit levels for a trade. ATR is calculated by taking the average of the true range values over a specified period.
True range is calculated as the maximum of the following values:
-Current high minus the current low
-Absolute value of the current high minus the previous close
-Absolute value of the current low minus the previous close
ATR is a dynamic indicator that changes with changes in volatility. As volatility increases, the value of ATR increases, and as volatility decreases, the value of ATR decreases. By using ATR in NNFX system, traders can adjust their stop loss and take profit levels according to the volatility of the asset being traded. This helps to ensure that the trade is given enough room to move, while also minimizing potential losses.
Other types of volatility include True Range Double (TRD), Close-to-Close, and Garman-Klass
What is a Baseline indicator?
The baseline is essentially a moving average, and is used to determine the overall direction of the market.
The baseline in the NNFX system is used to filter out trades that are not in line with the long-term trend of the market. The baseline is plotted on the chart along with other indicators, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR).
Trades are only taken when the price is in the same direction as the baseline. For example, if the baseline is sloping upwards, only long trades are taken, and if the baseline is sloping downwards, only short trades are taken. This approach helps to ensure that trades are in line with the overall trend of the market, and reduces the risk of entering trades that are likely to fail.
By using a baseline in the NNFX system, traders can have a clear reference point for determining the overall trend of the market, and can make more informed trading decisions. The baseline helps to filter out noise and false signals, and ensures that trades are taken in the direction of the long-term trend.
What is a Confirmation indicator?
Confirmation indicators are technical indicators that are used to confirm the signals generated by primary indicators. Primary indicators are the core indicators used in the NNFX system, such as the Average True Range (ATR), the Moving Average (MA), and the Relative Strength Index (RSI).
The purpose of the confirmation indicators is to reduce false signals and improve the accuracy of the trading system. They are designed to confirm the signals generated by the primary indicators by providing additional information about the strength and direction of the trend.
Some examples of confirmation indicators that may be used in the NNFX system include the Bollinger Bands, the MACD (Moving Average Convergence Divergence), and the MACD Oscillator. These indicators can provide information about the volatility, momentum, and trend strength of the market, and can be used to confirm the signals generated by the primary indicators.
In the NNFX system, confirmation indicators are used in combination with primary indicators and other filters to create a trading system that is robust and reliable. By using multiple indicators to confirm trading signals, the system aims to reduce the risk of false signals and improve the overall profitability of the trades.
What is a Continuation indicator?
In the NNFX (No Nonsense Forex) trading system, a continuation indicator is a technical indicator that is used to confirm a current trend and predict that the trend is likely to continue in the same direction. A continuation indicator is typically used in conjunction with other indicators in the system, such as a baseline indicator, to provide a comprehensive trading strategy.
What is a Volatility/Volume indicator?
Volume indicators, such as the On Balance Volume (OBV), the Chaikin Money Flow (CMF), or the Volume Price Trend (VPT), are used to measure the amount of buying and selling activity in a market. They are based on the trading volume of the market, and can provide information about the strength of the trend. In the NNFX system, volume indicators are used to confirm trading signals generated by the Moving Average and the Relative Strength Index. Volatility indicators include the Average Directional Index, Waddah Attar, and the Volatility Ratio. In the NNFX trading system, volatility is a proxy for volume and vice versa.
By using volume indicators as confirmation tools, the NNFX trading system aims to reduce the risk of false signals and improve the overall profitability of trades. These indicators can provide additional information about the market that is not captured by the primary indicators, and can help traders to make more informed trading decisions. In addition, volume indicators can be used to identify potential changes in market trends and to confirm the strength of price movements.
What is an Exit indicator?
The exit indicator is used in conjunction with other indicators in the system, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR), to provide a comprehensive trading strategy.
The exit indicator in the NNFX system can be any technical indicator that is deemed effective at identifying optimal exit points. Examples of exit indicators that are commonly used include the Parabolic SAR, the Average Directional Index (ADX), and the Chandelier Exit.
The purpose of the exit indicator is to identify when a trend is likely to reverse or when the market conditions have changed, signaling the need to exit a trade. By using an exit indicator, traders can manage their risk and prevent significant losses.
In the NNFX system, the exit indicator is used in conjunction with a stop loss and a take profit order to maximize profits and minimize losses. The stop loss order is used to limit the amount of loss that can be incurred if the trade goes against the trader, while the take profit order is used to lock in profits when the trade is moving in the trader's favor.
Overall, the use of an exit indicator in the NNFX trading system is an important component of a comprehensive trading strategy. It allows traders to manage their risk effectively and improve the profitability of their trades by exiting at the right time.
What is a Metamorphosis indicator?
The concept of a metamorphosis indicator involves the integration of two or more GKD indicators to generate a compound signal. This is achieved by evaluating the accuracy of each indicator and selecting the signal from the indicator with the highest accuracy. As an illustration, let's consider a scenario where we calculate the accuracy of 10 indicators and choose the signal from the indicator that demonstrates the highest accuracy.
The resulting output from the metamorphosis indicator can then be utilized in a GKD-BT backtest by occupying a slot that aligns with the purpose of the metamorphosis indicator. The slot can be a GKD-B, GKD-C, or GKD-E slot, depending on the specific requirements and objectives of the indicator. This allows for seamless integration and utilization of the compound signal within the GKD-BT framework.
How does Loxx's GKD (Giga Kaleidoscope Modularized Trading System) implement the NNFX algorithm outlined above?
Loxx's GKD v2.0 system has six types of modules (indicators/strategies). These modules are:
1. GKD-BT - Backtesting module (Volatility, Number 1 in the NNFX algorithm)
2. GKD-B - Baseline module (Baseline and Volatility/Volume, Numbers 2 and 6 in the NNFX algorithm)
3. GKD-C - Confirmation 1/2 and Continuation module (Confirmation 1/2 and Continuation, Numbers 3, 4, and 5 in the NNFX algorithm)
4. GKD-V - Volatility/Volume module (Volatility/Volume, Number 6 in the NNFX algorithm)
5. GKD-E - Exit module (Exit, Number 7 in the NNFX algorithm)
6. GKD-M - Metamorphosis module (Metamorphosis, Number 8 in the NNFX algorithm, but not part of the NNFX algorithm)
(additional module types will be added in future releases)
Each module passes its data along the chain, ultimately feeding a backtest module wherein the various components of the GKD system are combined to create a trading signal.
That is, the Baseline indicator passes its data to Volatility/Volume. The Volatility/Volume indicator passes its values to the Confirmation 1 indicator. The Confirmation 1 indicator passes its values to the Confirmation 2 indicator. The Confirmation 2 indicator passes its values to the Continuation indicator. The Continuation indicator passes its values to the Exit indicator, and finally, the Exit indicator passes its values to the Backtest strategy.
This chaining of indicators requires that each module conform to Loxx's GKD protocol, thereby allowing the testing of every possible combination of technical indicators that make up the components of the NNFX algorithm.
What does the application of the GKD trading system look like?
Example trading system:
Backtest: Multi-Ticker SCC Backtest
Baseline: Hull Moving Average
Volatility/Volume: Hurst Exponent
Confirmation 1: Fisher Transform
Confirmation 2: uf2018
Continuation: Vortex
Exit: Rex Oscillator
Metamorphosis: Baseline Optimizer
Each GKD indicator is denoted with a module identifier of either: GKD-BT, GKD-B, GKD-C, GKD-V, GKD-M, or GKD-E. This allows traders to understand to which module each indicator belongs and where each indicator fits into the GKD system.
█ Giga Kaleidoscope Modularized Trading System Signals
Standard Entry
1. GKD-C Confirmation gives signal
2. Baseline agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Volatility/Volume agrees
1-Candle Standard Entry
1a. GKD-C Confirmation gives signal
2a. Baseline agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
Next Candle
1b. Price retraced
2b. Baseline agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Baseline Entry
1. GKD-B Baseline gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Volatility/Volume agrees
7. Confirmation 1 signal was less than 'Maximum Allowable PSBC Bars Back' prior
1-Candle Baseline Entry
1a. GKD-B Baseline gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSBC Bars Back' prior
Next Candle
1b. Price retraced
2b. Baseline agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Volatility/Volume Entry
1. GKD-V Volatility/Volume gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Confirmation 2 agrees
6. Baseline agrees
7. Confirmation 1 signal was less than 7 candles prior
1-Candle Volatility/Volume Entry
1a. GKD-V Volatility/Volume gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSVVC Bars Back' prior
Next Candle
1b. Price retraced
2b. Volatility/Volume agrees
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Baseline agrees
Confirmation 2 Entry
1. GKD-C Confirmation 2 gives signal
2. Confirmation 1 agrees
3. Price inside Goldie Locks Zone Minimum
4. Price inside Goldie Locks Zone Maximum
5. Volatility/Volume agrees
6. Baseline agrees
7. Confirmation 1 signal was less than 7 candles prior
1-Candle Confirmation 2 Entry
1a. GKD-C Confirmation 2 gives signal
2a. Confirmation 1 agrees
3a. Price inside Goldie Locks Zone Minimum
4a. Price inside Goldie Locks Zone Maximum
5a. Confirmation 1 signal was less than 'Maximum Allowable PSC2C Bars Back' prior
Next Candle
1b. Price retraced
2b. Confirmation 2 agrees
3b. Confirmation 1 agrees
4b. Volatility/Volume agrees
5b. Baseline agrees
PullBack Entry
1a. GKD-B Baseline gives signal
2a. Confirmation 1 agrees
3a. Price is beyond 1.0x Volatility of Baseline
Next Candle
1b. Price inside Goldie Locks Zone Minimum
2b. Price inside Goldie Locks Zone Maximum
3b. Confirmation 1 agrees
4b. Confirmation 2 agrees
5b. Volatility/Volume agrees
Continuation Entry
1. Standard Entry, 1-Candle Standard Entry, Baseline Entry, 1-Candle Baseline Entry, Volatility/Volume Entry, 1-Candle Volatility/Volume Entry, Confirmation 2 Entry, 1-Candle Confirmation 2 Entry, or Pullback entry triggered previously
2. Baseline hasn't crossed since entry signal trigger
3. Confirmation 1 agrees
4. Baseline agrees
5. Confirmation 2 agrees
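Putting the entry types above together, a highly simplified, hypothetical sketch of how a Standard Entry could be evaluated on a bar; each boolean stands in for the output of the corresponding GKD module:

```python
def standard_entry_long(c1_signal_long, baseline_up, price_in_zone,
                        c2_agrees_long, vol_volume_agrees):
    """Standard Entry (long side): all module conditions must align on the same bar."""
    return (c1_signal_long and baseline_up and price_in_zone
            and c2_agrees_long and vol_volume_agrees)
```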
█ Connecting to Backtests
All GKD indicators are chained indicators, meaning you export the values of the indicators to a specialized backtest to create your GKD trading system. Each indicator contains a proprietary signal generation algo that will only work with GKD backtests. You can find these backtests using the links below.
GKD-BT Giga Confirmation Stack Backtest
GKD-BT Giga Stacks Backtest
GKD-BT Full Giga Kaleidoscope Backtest
GKD-BT Solo Confirmation Super Complex Backtest
GKD-BT Solo Confirmation Complex Backtest
GKD-BT Solo Confirmation Simple Backtest
GKD-M Baseline Optimizer
GKD-M Accuracy Alchemist
GKD-B Stepped Baseline [Loxx]Giga Kaleidoscope GKD-B Stepped Baseline is a Baseline module included in Loxx's "Giga Kaleidoscope Modularized Trading System".
█ GKD-B Stepped Baseline
This is a special implementation of GKD-B Baseline in that it allows the user to filter the selected moving average using the various types of volatility listed below. This additional filter allows the trader to identify longer trends that may be more confucive to a slow and steady trading style.
GKD Stepped Baseline includes 64 different moving averages:
Adaptive Moving Average - AMA
ADXvma - Average Directional Volatility Moving Average
Ahrens Moving Average
Alexander Moving Average - ALXMA
Deviation Scaled Moving Average - DSMA
Donchian
Double Exponential Moving Average - DEMA
Double Smoothed Exponential Moving Average - DSEMA
Double Smoothed FEMA - DSFEMA
Double Smoothed Range Weighted EMA - DSRWEMA
Double Smoothed Wilders EMA - DSWEMA
Double Weighted Moving Average - DWMA
Ehlers Optimal Tracking Filter - EOTF
Exponential Moving Average - EMA
Fast Exponential Moving Average - FEMA
Fractal Adaptive Moving Average - FRAMA
Generalized DEMA - GDEMA
Generalized Double DEMA - GDDEMA
Hull Moving Average (Type 1) - HMA1
Hull Moving Average (Type 2) - HMA2
Hull Moving Average (Type 3) - HMA3
Hull Moving Average (Type 4) - HMA4
IE /2 - Early T3 by Tim Tilson
Integral of Linear Regression Slope - ILRS
Instantaneous Trendline
Kalman Filter
Kaufman Adaptive Moving Average - KAMA
Laguerre Filter
Leader Exponential Moving Average
Linear Regression Value - LSMA ( Least Squares Moving Average )
Linear Weighted Moving Average - LWMA
McGinley Dynamic
McNicholl EMA
Non-Lag Moving Average
Ocean NMA Moving Average - ONMAMA
One More Moving Average - OMA
Parabolic Weighted Moving Average
Probability Density Function Moving Average - PDFMA
Quadratic Regression Moving Average - QRMA
Regularized EMA - REMA
Range Weighted EMA - RWEMA
Recursive Moving Trendline
Simple Decycler - SDEC
Simple Jurik Moving Average - SJMA
Simple Moving Average - SMA
Sine Weighted Moving Average
Smoothed LWMA - SLWMA
Smoothed Moving Average - SMMA
Smoother
Super Smoother
T3
Three-pole Ehlers Butterworth
Three-pole Ehlers Smoother
Triangular Moving Average - TMA
Triple Exponential Moving Average - TEMA
Two-pole Ehlers Butterworth
Two-pole Ehlers smoother
Variable Index Dynamic Average - VIDYA
Variable Moving Average - VMA
Volume Weighted EMA - VEMA
Volume Weighted Moving Average - VWMA
Zero-Lag DEMA - Zero Lag Exponential Moving Average
Zero-Lag Moving Average
Zero Lag TEMA - Zero Lag Triple Exponential Moving Average
Adaptive Moving Average - AMA
The Adaptive Moving Average (AMA) is a moving average that changes its sensitivity to price moves depending on the calculated volatility. It becomes more sensitive during periods when the price is moving smoothly in a certain direction and becomes less sensitive when the price is volatile.
ADXvma - Average Directional Volatility Moving Average
Linnsoft's ADXvma formula is a volatility-based moving average, with the volatility being determined by the value of the ADX indicator.
The ADXvma has the SMA in Chande's CMO replaced with an EMA , it then uses a few more layers of EMA smoothing before the "Volatility Index" is calculated.
A side effect is, those additional layers slow down the ADXvma when you compare it to Chande's Variable Index Dynamic Average VIDYA .
The ADXVMA provides support during uptrends and resistance during downtrends and will stay flat for longer, but will create some of the most accurate market signals when it decides to move.
Ahrens Moving Average
Richard D. Ahrens's Moving Average promises "Smoother Data" that isn't influenced by the occasional price spike. It works by using the Open and the Close in his formula so that the only time the Ahrens Moving Average will change is when the candlestick is either making new highs or new lows.
Alexander Moving Average - ALXMA
This Moving Average uses an elaborate smoothing formula and utilizes a 7 period Moving Average. It corresponds to fitting a second-order polynomial to seven consecutive observations. This moving average is rarely used in trading but is interesting as this Moving Average has been applied to diffusion indexes that tend to be very volatile.
Deviation Scaled Moving Average - DSMA
The Deviation-Scaled Moving Average is a data smoothing technique that acts like an exponential moving average with a dynamic smoothing coefficient. The smoothing coefficient is automatically updated based on the magnitude of price changes. In the Deviation-Scaled Moving Average, the standard deviation from the mean is chosen to be the measure of this magnitude. The resulting indicator provides substantial smoothing of the data even when price changes are small while quickly adapting to these changes.
Donchian
Donchian Channels are three lines generated by moving average calculations that comprise an indicator formed by upper and lower bands around a midrange or median band. The upper band marks the highest price of a security over N periods while the lower band marks the lowest price of a security over N periods.
Double Exponential Moving Average - DEMA
The Double Exponential Moving Average ( DEMA ) combines a smoothed EMA and a single EMA to provide a low-lag indicator. It's primary purpose is to reduce the amount of "lagging entry" opportunities, and like all Moving Averages, the DEMA confirms uptrends whenever price crosses on top of it and closes above it, and confirms downtrends when the price crosses under it and closes below it - but with significantly less lag.
Double Smoothed Exponential Moving Average - DSEMA
The Double Smoothed Exponential Moving Average is a lot less laggy compared to a traditional EMA . It's also considered a leading indicator compared to the EMA , and is best utilized whenever smoothness and speed of reaction to market changes are required.
Double Smoothed FEMA - DSFEMA
Same as the Double Exponential Moving Average (DEMA), but uses a faster version of EMA for its calculation.
Double Smoothed Range Weighted EMA - DSRWEMA
Range weighted exponential moving average (EMA) is, unlike the "regular" range weighted average calculated in a different way. Even though the basis - the range weighting - is the same, the way how it is calculated is completely different. By definition this type of EMA is calculated as a ratio of EMA of price*weight / EMA of weight. And the results are very different and the two should be considered as completely different types of averages. The higher than EMA to price changes responsiveness when the ranges increase remains in this EMA too and in those cases this EMA is clearly leading the "regular" EMA. This version includes double smoothing.
Double Smoothed Wilders EMA - DSWEMA
Welles Wilder was frequently using one "special" case of EMA (Exponential Moving Average) that is due to that fact (that he used it) sometimes called Wilder's EMA. This version is adding double smoothing to Wilder's EMA in order to make it "faster" (it is more responsive to market prices than the original) and is still keeping very smooth values.
Double Weighted Moving Average - DWMA
Double weighted moving average is an LWMA (Linear Weighted Moving Average). Instead of doing one cycle for calculating the LWMA, the indicator is made to cycle the loop 2 times. That produces a smoother values than the original LWMA
Ehlers Optimal Tracking Filter - EOTF
The Elher's Optimum Tracking Filter quickly adjusts rapid shifts in the price and yet is relatively smooth when the price has a sideways action. The operation of this filter is similar to Kaufman’s Adaptive Moving
Average
Exponential Moving Average - EMA
The EMA places more significance on recent data points and moves closer to price than the SMA ( Simple Moving Average ). It reacts faster to volatility due to its emphasis on recent data and is known for its ability to give greater weight to recent and more relevant data. The EMA is therefore seen as an enhancement over the SMA .
Fast Exponential Moving Average - FEMA
An Exponential Moving Average with a short look-back period.
Fractal Adaptive Moving Average - FRAMA
The Fractal Adaptive Moving Average by John Ehlers is an intelligent adaptive Moving Average which takes the importance of price changes into account and follows price closely enough to display significant moves whilst remaining flat if price ranges. The FRAMA does this by dynamically adjusting the look-back period based on the market's fractal geometry.
Generalized DEMA - GDEMA
The double exponential moving average (DEMA), was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages.". Instead of using fixed multiplication factor in the final DEMA formula, the generalized version allows you to change it. By varying the "volume factor" form 0 to 1 you apply different multiplications and thus producing DEMA with different "speed" - the higher the volume factor is the "faster" the DEMA will be (but also the slope of it will be less smooth). The volume factor is limited in the calculation to 1 since any volume factor that is larger than 1 is increasing the overshooting to the extent that some volume factors usage makes the indicator unusable.
Generalized Double DEMA - GDDEMA
The double exponential moving average (DEMA), was developed by Patrick Mulloy in an attempt to reduce the amount of lag time found in traditional moving averages. It was first introduced in the February 1994 issue of the magazine Technical Analysis of Stocks & Commodities in Mulloy's article "Smoothing Data with Faster Moving Averages''. This is an extension of the Generalized DEMA using Tim Tillsons (the inventor of T3) idea, and is using GDEMA of GDEMA for calculation (which is the "middle step" of T3 calculation). Since there are no versions showing that middle step, this version covers that too. The result is smoother than Generalized DEMA, but is less smooth than T3 - one has to do some experimenting in order to find the optimal way to use it, but in any case, since it is "faster" than the T3 (Tim Tillson T3) and still smooth, it looks like a good compromise between speed and smoothness.
Hull Moving Average (Type 1) - HMA1
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMA for smoothing.
Hull Moving Average (Type 2) - HMA2
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses EMA for smoothing.
Hull Moving Average (Type 3) - HMA3
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses LWMA for smoothing.
Hull Moving Average (Type 4) - HMA4
Alan Hull's HMA makes use of weighted moving averages to prioritize recent values and greatly reduce lag whilst maintaining the smoothness of a traditional Moving Average. For this reason, it's seen as a well-suited Moving Average for identifying entry points. This version uses SMMA for smoothing.
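For reference, a minimal Pine Script v5 sketch of the Hull construction is shown below using the SMA smoothing of Type 1; the other types follow the same structure with a different smoother, and whether the alternative smoother is applied to every stage or only the final one is an assumption here:

```
//@version=5
indicator("Hull MA (Type 1) sketch", overlay=true)
len = input.int(21, "Length", minval=2)
halfLen = math.max(1, math.floor(len / 2))
sqrtLen = math.max(1, math.floor(math.sqrt(len)))

// Classic Hull structure: a de-lagged series smoothed over sqrt(length).
// HMA2 / HMA3 / HMA4 swap ta.ema / ta.wma / ta.rma in place of ta.sma.
deLagged = 2 * ta.sma(close, halfLen) - ta.sma(close, len)
hma1 = ta.sma(deLagged, sqrtLen)
plot(hma1, "HMA1", color=color.blue)
```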
IE /2 - Early T3 by Tim Tilson and T3 new
The T3 moving average is a type of technical indicator used in financial analysis to identify trends in price movements. It is similar to the Exponential Moving Average (EMA) and the Double Exponential Moving Average (DEMA), but uses a different smoothing algorithm.
The T3 moving average is calculated using a series of exponential moving averages that are designed to filter out noise and smooth the data. The resulting smoothed data is then weighted with a non-linear function to produce a final output that is more responsive to changes in trend direction.
The T3 moving average can be customized by adjusting the length of the moving average, as well as the weighting function used to smooth the data. It is commonly used in conjunction with other technical indicators as part of a larger trading strategy.
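The widely published Tillson T3 construction applies the generalized DEMA step three times; a Pine Script v5 sketch follows, with illustrative input names and the customary 0.7 volume factor as assumptions:

```
//@version=5
indicator("T3 sketch", overlay=true)
len = input.int(14, "Length", minval=1)
vf  = input.float(0.7, "Volume Factor", minval=0.0, maxval=1.0)

// One generalized-DEMA pass: EMA and EMA-of-EMA blended by the volume factor
e1 = ta.ema(close, len)
e2 = ta.ema(e1, len)
gd1 = e1 * (1 + vf) - e2 * vf
e3 = ta.ema(gd1, len)
e4 = ta.ema(e3, len)
gd2 = e3 * (1 + vf) - e4 * vf
e5 = ta.ema(gd2, len)
e6 = ta.ema(e5, len)
t3 = e5 * (1 + vf) - e6 * vf   // T3 = GD(GD(GD(price)))
plot(t3, "T3", color=color.purple)
```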
Integral of Linear Regression Slope - ILRS
A Moving Average where the slope of a linear regression line is integrated as the line is fitted in a moving window of length N across the data. The derivative of ILRS is the linear regression slope. ILRS is not the same as an SMA (Simple Moving Average) of length N, which is actually the midpoint of the linear regression line as it moves across the data.
Instantaneous Trendline
The Instantaneous Trendline is created by removing the dominant cycle component from the price information which makes this Moving Average suitable for medium to long-term trading.
Kalman Filter
The Kalman filter is an algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies, to produce estimates of the underlying state. This means the filter was originally designed to work with noisy data, and it is also able to work with incomplete data. Another advantage is that it is designed for, and applied in, dynamic systems; a price chart is such a system. This version is true to the original design of the trade-ready Kalman Filter, where velocity is the triggering mechanism.
The Kalman Filter is a more accurate smoothing/prediction algorithm than a moving average because it is adaptive: it accounts for estimation errors and adjusts its predictions using what it learned in the previous step. Conceptually, the Kalman Filter consists of measurement and transition components.
Kaufman Adaptive Moving Average - KAMA
Developed by Perry Kaufman, Kaufman's Adaptive Moving Average (KAMA) is a moving average designed to account for market noise or volatility. KAMA will closely follow prices when the price swings are relatively small and the noise is low.
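Kaufman's published formulation scales an EMA-style smoothing constant by an efficiency ratio (net movement divided by total movement). A minimal Pine Script v5 sketch, with illustrative input names and the usual 2/30 fast/slow bounds as assumptions:

```
//@version=5
indicator("KAMA sketch", overlay=true)
len  = input.int(10, "Efficiency Ratio Length", minval=1)
fast = input.int(2,  "Fast EMA Length", minval=1)
slow = input.int(30, "Slow EMA Length", minval=1)

netChange   = math.abs(ta.change(close, len))            // net movement over the window
totalChange = math.sum(math.abs(ta.change(close)), len)  // sum of bar-to-bar movement
er          = totalChange == 0 ? 0.0 : netChange / totalChange
fastAlpha   = 2.0 / (fast + 1)
slowAlpha   = 2.0 / (slow + 1)
sc          = math.pow(er * (fastAlpha - slowAlpha) + slowAlpha, 2)

var float kama = na
kama := na(kama[1]) ? close : kama[1] + sc * (close - kama[1])
plot(kama, "KAMA", color=color.teal)
```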
Laguerre Filter
The Laguerre Filter is a smoothing filter based on Laguerre polynomials. The filter requires the current price, three prior prices, and a user-defined factor called Alpha to complete its calculation.
Adjusting the Alpha coefficient increases or decreases the filter's lag and smoothness.
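Ehlers' published four-element Laguerre filter looks roughly like the Pine Script v5 sketch below; expressing gamma as 1 minus the user-facing Alpha is an assumption for illustration:

```
//@version=5
indicator("Laguerre Filter sketch", overlay=true)
alpha = input.float(0.8, "Alpha", minval=0.0, maxval=1.0)
gamma = 1.0 - alpha   // assumption: gamma derived from the user-facing Alpha

var float l0 = na
var float l1 = na
var float l2 = na
var float l3 = na

// Four cascaded Laguerre stages fed by the current price
l0 := (1 - gamma) * close + gamma * nz(l0[1], close)
l1 := -gamma * l0 + nz(l0[1], close) + gamma * nz(l1[1], close)
l2 := -gamma * l1 + nz(l1[1], close) + gamma * nz(l2[1], close)
l3 := -gamma * l2 + nz(l2[1], close) + gamma * nz(l3[1], close)

// Weighted sum of the stages gives the filter output
laguerre = (l0 + 2 * l1 + 2 * l2 + l3) / 6
plot(laguerre, "Laguerre Filter", color=color.blue)
```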
Leader Exponential Moving Average
The Leader EMA was created by Giorgos E. Siligardos who created a Moving Average which was able to eliminate lag altogether whilst maintaining some smoothness. It was first described during his research paper "MACD Leader" where he applied this to the MACD to improve its signals and remove its lagging issue. This filter uses his leading MACD's "modified EMA" and can be used as a zero lag filter.
Linear Regression Value - LSMA ( Least Squares Moving Average )
LSMA as a Moving Average is based on plotting the end point of the linear regression line. It compares the current value to the prior value and a determination is made of a possible trend, e.g., whether the linear regression line is pointing up or down.
Linear Weighted Moving Average - LWMA
LWMA reacts to price quicker than the SMA and EMA . Although it's similar to the Simple Moving Average , the difference is that a weight coefficient is multiplied to the price which means the most recent price has the highest weighting, and each prior price has progressively less weight. The weights drop in a linear fashion.
McGinley Dynamic
John McGinley created this Moving Average to track prices better than traditional Moving Averages. It does this by incorporating an automatic adjustment factor into its formula, which speeds (or slows) the indicator in trending, or ranging, markets.
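The commonly cited McGinley Dynamic recursion is sketched below in Pine Script v5; the adjustment constant k (often quoted as 0.6) and the input names are illustrative assumptions:

```
//@version=5
indicator("McGinley Dynamic sketch", overlay=true)
len = input.int(14, "Length", minval=1)
k   = input.float(0.6, "Adjustment Constant", minval=0.1)

var float md = na
// The (price / MD)^4 term makes the average track falling prices faster than rising ones
md := na(md[1]) ? close : md[1] + (close - md[1]) / (k * len * math.pow(close / md[1], 4))
plot(md, "McGinley Dynamic", color=color.blue)
```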
McNicholl EMA
Dennis McNicholl developed this Moving Average to use as his center line for his "Better Bollinger Bands" indicator and was successful because it responded better to volatility changes over the standard SMA and managed to avoid common whipsaws.
Non-lag moving average
The Non-Lag Moving Average follows price closely and gives very quick signals as well as early signals of price change. It should not be used on its own, but rather as an additional confluence tool for early signals.
Ocean NMA Moving Average - ONMAMA
Created by Jim Sloman, the NMA is a moving average that automatically adjusts to volatility without being programmed to do so. For more info, read his guide "Ocean Theory, an Introduction"
One More Moving Average (OMA)
The One More Moving Average (OMA) is a technical indicator that calculates a series of Jurik-style moving averages in order to reduce noise and provide smoother price data. It uses six exponential moving averages to generate the final value, with the length of the moving averages determined by an adaptive algorithm that adjusts to the current market conditions. The algorithm calculates the average period by comparing the signal to noise ratio and using this value to determine the length of the moving averages. The resulting values are used to generate the final value of the OMA, which can be used to identify trends and potential changes in trend direction.
Parabolic Weighted Moving Average
The Parabolic Weighted Moving Average is a variation of the Linear Weighted Moving Average . The Linear Weighted Moving Average calculates the average by assigning different weights to each element in its calculation. The Parabolic Weighted Moving Average is a variation that allows weights to be changed to form a parabolic curve. It is done simply by using the Power parameter of this indicator.
Probability Density Function Moving Average - PDFMA
The probability density function based MA is a sort of weighted moving average that uses a probability density function to calculate the weights. By its nature it is similar to a lot of digital filters.
Quadratic Regression Moving Average - QRMA
A quadratic regression is the process of finding the equation of the parabola that best fits a set of data. This moving average is an obscure concept that was posted to Forex forums in around 2008.
Regularized EMA - REMA
The regularized exponential moving average (REMA) by Chris Satchwell is a variation on the EMA (see Exponential Moving Average) designed to be smoother but not introduce too much extra lag.
Range Weighted EMA - RWEMA
This indicator is a variation of the range weighted EMA. The variation comes from a possible need to make that indicator a bit less "noisy" when it comes to slope changes. The method used for calculating this variation is the method described by Lee Leibfarth in his article "Trading With An Adaptive Price Zone".
Recursive Moving Trendline
Dennis Meyers's Recursive Moving Trendline uses a recursive (repeated application of a rule) polynomial fit, a technique that uses a small number of past price estimates together with today's price to predict tomorrow's price.
Simple Decycler - SDEC
The Ehlers Simple Decycler study is a virtually zero-lag technical indicator proposed by John F. Ehlers. The original idea behind this study (and several others created by John F. Ehlers) is that market data can be considered a continuum of cycle periods with different cycle amplitudes. Thus, trending periods can be considered segments of longer cycles, or, in other words, low-frequency segments. Applying the right filter might help identify these segments.
Simple Loxx Moving Average - SLMA
A three-stage moving average combining an adaptive EMA, a Kalman Filter, and a Kaufman adaptive filter.
Simple Moving Average - SMA
The SMA calculates the average of a range of prices by adding recent prices and then dividing that figure by the number of time periods in the calculation average. It is the most basic Moving Average which is seen as a reliable tool for starting off with Moving Average studies. As reliable as it may be, the basic moving average will work better when it's enhanced into an EMA .
Sine Weighted Moving Average
The Sine Weighted Moving Average assigns the most weight at the middle of the data set. It does this by weighting from the first half of a Sine Wave Cycle and the most weighting is given to the data in the middle of that data set. The Sine WMA closely resembles the TMA (Triangular Moving Average).
Smoothed LWMA - SLWMA
A smoothed version of the LWMA
Smoothed Moving Average - SMMA
The Smoothed Moving Average is similar to the Simple Moving Average ( SMA ), but aims to reduce noise rather than reduce lag. SMMA takes all prices into account and uses a long lookback period. Due to this, it's seen as an accurate yet laggy Moving Average.
Smoother
The Smoother filter is a faster-reacting smoothing technique which generates considerably less lag than the SMMA (Smoothed Moving Average). It gives earlier signals but can also create false signals due to its earlier reactions. This filter is sometimes mistaken for the superior Jurik Smoothing algorithm.
Super Smoother
The Super Smoother filter uses John Ehlers's "Super Smoother", which consists of a two-pole Butterworth filter combined with a 2-bar SMA (Simple Moving Average) that suppresses the Nyquist frequency, the highest frequency representable by a sampler (which converts a continuous signal into a discrete sequence).
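Ehlers' published Super Smoother coefficients look like the following Pine Script v5 sketch; the input name is illustrative and this is not necessarily the exact code used in this module:

```
//@version=5
indicator("Super Smoother sketch", overlay=true)
len = input.int(10, "Length", minval=2)

// Two-pole Butterworth coefficients combined with a 2-bar average of price
a1 = math.exp(-1.414 * math.pi / len)
b1 = 2 * a1 * math.cos(1.414 * math.pi / len)
c2 = b1
c3 = -a1 * a1
c1 = 1 - c2 - c3

var float ssf = na
ssf := c1 * (close + nz(close[1], close)) / 2 + c2 * nz(ssf[1], close) + c3 * nz(ssf[2], close)
plot(ssf, "Super Smoother", color=color.blue)
```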
Three-pole Ehlers Butterworth
The 3 pole Ehlers Butterworth (as well as the Two pole Butterworth) are both superior alternatives to the EMA and SMA . They aim at producing less lag whilst maintaining accuracy. The 2 pole filter will give you a better approximation for price, whereas the 3 pole filter has superior smoothing.
Three-pole Ehlers smoother
The 3 pole Ehlers smoother works almost as close to price as the above mentioned 3 Pole Ehlers Butterworth. It acts as a strong baseline for signals but removes some noise. Side by side, it hardly differs from the Three Pole Ehlers Butterworth but when examined closely, it has better overshoot reduction compared to the 3 pole Ehlers Butterworth.
Triangular Moving Average - TMA
The TMA is similar to the EMA but uses a different weighting scheme. Exponential and weighted Moving Averages assign more weight to the most recent price data, while simple moving averages assign weight equally across all the price data. A TMA (Triangular Moving Average) is double smoothed (averaged twice), so the majority of the weight is assigned to the middle portion of the data.
Triple Exponential Moving Average - TEMA
The TEMA uses multiple EMA calculations as well as subtracting lag to create a tool which can be used for scalping pullbacks. As it follows price closely, its signals are considered very noisy and should only be used in extremely fast-paced trading conditions.
Two-pole Ehlers Butterworth
The 2 pole Ehlers Butterworth (as well as the three pole Butterworth mentioned above) is another filter that cuts out noise and follows price closely. The 2 pole is seen as a faster, leading filter over the 3 pole and follows price a bit more closely. Analysts will use both a 2 pole and a 3 pole Butterworth on the same chart with the same period; having both on the chart allows their crosses to be traded.
Two-pole Ehlers smoother
A smoother version of the Two-pole Ehlers Butterworth. It is faster than the three-pole version and does a decent job of cutting out market noise whilst following price more closely.
Variable Index Dynamic Average - VIDYA
Variable Index Dynamic Average Technical Indicator ( VIDYA ) was developed by Tushar Chande. It is an original method of calculating the Exponential Moving Average ( EMA ) with the dynamically changing period of averaging.
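Chande's formulation scales the EMA alpha by the absolute value of the Chande Momentum Oscillator; the Pine Script v5 sketch below uses illustrative input names and lengths:

```
//@version=5
indicator("VIDYA sketch", overlay=true)
len    = input.int(14, "EMA Length", minval=1)
cmoLen = input.int(9,  "CMO Length", minval=1)

mom   = ta.change(close)
up    = math.sum(math.max(mom, 0), cmoLen)
down  = math.sum(math.max(-mom, 0), cmoLen)
cmo   = up + down == 0 ? 0.0 : math.abs((up - down) / (up + down))  // 0..1
alpha = 2.0 / (len + 1)

var float vidya = na
// The effective smoothing constant grows when momentum is one-sided and shrinks in chop
vidya := na(vidya[1]) ? close : alpha * cmo * close + (1 - alpha * cmo) * vidya[1]
plot(vidya, "VIDYA", color=color.blue)
```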
Variable Moving Average - VMA
The Variable Moving Average (VMA) is a study that uses an Exponential Moving Average being able to automatically adjust its smoothing factor according to the market volatility.
Volume Weighted EMA - VEMA
An EMA that uses a volume and price weighted calculation instead of the standard price input.
Volume Weighted Moving Average - VWMA
A Volume Weighted Moving Average is a moving average where more weight is given to bars with heavy volume than with light volume. Thus the value of the moving average will be closer to where most trading actually happened than it otherwise would be without being volume weighted.
Zero-Lag DEMA - Zero Lag Double Exponential Moving Average
John Ehlers's Zero Lag DEMA's aim is to eliminate the inherent lag associated with all trend following indicators which average a price over time. Because this is a Double Exponential Moving Average with Zero Lag, it has a tendency to overshoot and create a lot of false signals for swing trading. It can however be used for quick scalping or as a secondary indicator for confluence.
Zero-Lag Moving Average
The Zero Lag Moving Average is described by its creator, John Ehlers, as a Moving Average with absolutely no delay. For this reason, the filter produces many abrupt signals, which is not ideal for medium to long-term traders. The filter is designed to follow price as closely as possible whilst de-lagging the data, and it does this by attempting to remove the cumulative lag effect of the Moving Average.
Zero-Lag TEMA - Zero Lag Triple Exponential Moving Average
Just like the Zero Lag DEMA , this filter will give you the fastest signals out of all the Zero Lag Moving Averages. This is useful for scalping but dangerous for medium to long-term traders, especially during market Volatility and news events. Having no lag, this filter also has no smoothing in its signals and can cause some very bizarre behavior when applied to certain indicators.
Volatility Goldie Locks Zone
This volatility filter is the standard first-pass filter used in all NNFX systems, even though an additional volatility/volume filter is used in step 5. For this filter, price must fall into a range of maximum and minimum values calculated using multiples of volatility. Unlike standard NNFX systems, this version of volatility filtering is separated from the core Baseline and uses its own moving average with Loxx's Exotic Source Types. The green and red dots at the top of the chart denote whether a candle qualifies for a long or a short, respectively. The green and red triangles at the bottom of the chart denote whether the trigger has crossed up or down and qualifies inside the Goldie Locks zone. White coloring of the Goldie Locks Zone mean line indicates where volatility is too low to trade.
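As a rough illustration of the filter's logic only (not the module's actual code), the Pine Script v5 sketch below marks candles whose distance from a mean sits between 0.2x and 1.0x of an ATR-based volatility measure; the choice of SMA and ATR, the default lengths, and the directional interpretation are all assumptions for the example:

```
//@version=5
indicator("Goldie Locks zone sketch", overlay=true)
len     = input.int(30, "Mean/Volatility Length", minval=1)
minMult = input.float(0.2, "Minimum Volatility Multiple", minval=0.0)
maxMult = input.float(1.0, "Maximum Volatility Multiple", minval=0.0)

mean = ta.sma(close, len)   // stand-in for the Goldie Locks mean line
vol  = ta.atr(len)          // stand-in for the chosen volatility measure
dist = close - mean

// A candle qualifies when price has moved far enough from the mean to matter,
// but not so far that the move is likely exhausted.
longOK  = dist >= minMult * vol and dist <= maxMult * vol
shortOK = -dist >= minMult * vol and -dist <= maxMult * vol

plotshape(longOK,  style=shape.circle, location=location.top,    color=color.green, size=size.tiny)
plotshape(shortOK, style=shape.circle, location=location.bottom, color=color.red,   size=size.tiny)
```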
Volatility Types Included
Close-to-Close
Close-to-Close volatility is a classic and the most commonly used volatility measure, sometimes referred to as historical volatility.
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one whose price changes rapidly and with large amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility is calculated using only the stock's closing prices. It is the simplest volatility estimator, but in many cases it is not precise enough. Stock prices can jump considerably during a trading session and return to the opening value by the end, which means a large amount of price information is not captured by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
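A minimal Pine Script v5 sketch of close-to-close volatility as the annualized standard deviation of log returns; the lookback, the 252 annualization factor, and the input names are illustrative:

```
//@version=5
indicator("Close-to-Close volatility sketch")
len    = input.int(20,  "Lookback", minval=2)
annual = input.int(252, "Periods Per Year", minval=1)

logRet = math.log(close / close[1])
ctc    = ta.stdev(logRet, len) * math.sqrt(annual)   // annualized close-to-close volatility
plot(ctc, "Close-to-Close Volatility", color=color.blue)
```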
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. That is useful as close to close prices could show little difference while large price movements could have happened during the day. Thus Parkinson's volatility is considered to be more precise and requires less data for calculation than the close-close volatility .
One drawback of this estimator is that it doesn't take into account price movements after market close. Hence it systematically undervalues volatility . That drawback is taken into account in the Garman-Klass's volatility estimator.
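The textbook Parkinson estimator uses the squared log of the high/low range scaled by 1/(4·ln 2); a Pine Script v5 sketch with illustrative inputs:

```
//@version=5
indicator("Parkinson volatility sketch")
len    = input.int(20,  "Lookback", minval=1)
annual = input.int(252, "Periods Per Year", minval=1)

// Parkinson variance: average squared log range, scaled by 1 / (4 * ln(2))
hl    = math.pow(math.log(high / low), 2)
parkV = math.sum(hl, len) / (4 * math.log(2) * len)
park  = math.sqrt(parkV) * math.sqrt(annual)
plot(park, "Parkinson Volatility", color=color.blue)
```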
Garman-Klass
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (Geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchel have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
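The textbook Garman-Klass per-bar term combines the squared log range with an open-to-close correction; a Pine Script v5 sketch with illustrative inputs:

```
//@version=5
indicator("Garman-Klass volatility sketch")
len    = input.int(20,  "Lookback", minval=1)
annual = input.int(252, "Periods Per Year", minval=1)

// Per-bar Garman-Klass term: range component minus an open-to-close correction
gkTerm = 0.5 * math.pow(math.log(high / low), 2) - (2 * math.log(2) - 1) * math.pow(math.log(close / open), 2)
gk = math.sqrt(math.sum(gkTerm, len) / len) * math.sqrt(annual)
plot(gk, "Garman-Klass Volatility", color=color.blue)
```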
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike the Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates a drift term (a mean return not equal to zero). As a result, it provides a better volatility estimate when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. This leads to an underestimation of volatility, since price jumps periodically occur in the market precisely between sessions.
A more comprehensive estimator that also considers the gaps between sessions was developed on the basis of the Rogers-Satchell formula in the 2000s by Yang-Zhang. See Yang Zhang Volatility for more detail.
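The standard Rogers-Satchell per-bar term is shown in the Pine Script v5 sketch below; inputs are illustrative:

```
//@version=5
indicator("Rogers-Satchell volatility sketch")
len    = input.int(20,  "Lookback", minval=1)
annual = input.int(252, "Periods Per Year", minval=1)

// Per-bar Rogers-Satchell term; referencing the open lets it tolerate drift
rsTerm = math.log(high / close) * math.log(high / open) + math.log(low / close) * math.log(low / open)
rs = math.sqrt(math.sum(rsTerm, len) / len) * math.sqrt(annual)
plot(rs, "Rogers-Satchell Volatility", color=color.blue)
```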
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
We can think of the Yang-Zhang volatility as the combination of the overnight (close-to-open) volatility and a weighted average of the Rogers-Satchell volatility and the day's open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
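A Pine Script v5 sketch of the commonly published Yang-Zhang combination; using ta.stdev for the overnight and open-to-close variances and the input defaults are simplifying assumptions:

```
//@version=5
indicator("Yang-Zhang volatility sketch")
len    = input.int(20,  "Lookback", minval=2)
annual = input.int(252, "Periods Per Year", minval=1)

overnight = math.log(open / close[1])   // close-to-open return
openClose = math.log(close / open)      // open-to-close return
rsTerm    = math.log(high / close) * math.log(high / open) + math.log(low / close) * math.log(low / open)

k     = 0.34 / (1.34 + (len + 1.0) / (len - 1))   // Yang-Zhang weighting constant
varOn = math.pow(ta.stdev(overnight, len), 2)     // overnight variance
varOc = math.pow(ta.stdev(openClose, len), 2)     // open-to-close variance
varRs = math.sum(rsTerm, len) / len               // Rogers-Satchell variance

yz = math.sqrt(varOn + k * varOc + (1 - k) * varRs) * math.sqrt(annual)
plot(yz, "Yang-Zhang Volatility", color=color.blue)
```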
Garman-Klass-Yang-Zhang
Garman-Klass-Yang-Zhang (GKYZ) volatility estimator consists of using the returns of open, high, low, and closing prices in its calculation.
GKYZ volatility estimator takes into account overnight jumps but not the trend, i.e. it assumes that the underlying asset follows a GBM process with zero drift. Therefore the GKYZ volatility estimator tends to overestimate the volatility when the drift is different from zero. However, for a GBM process, this estimator is eight times more efficient than the close-to-close volatility estimator.
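A Pine Script v5 sketch of the GKYZ per-bar term, which is the Garman-Klass term plus a squared overnight-jump component; inputs are illustrative:

```
//@version=5
indicator("Garman-Klass-Yang-Zhang volatility sketch")
len    = input.int(20,  "Lookback", minval=1)
annual = input.int(252, "Periods Per Year", minval=1)

// Garman-Klass term plus a squared overnight-jump term
gkyzTerm = math.pow(math.log(open / close[1]), 2) + 0.5 * math.pow(math.log(high / low), 2) - (2 * math.log(2) - 1) * math.pow(math.log(close / open), 2)
gkyz = math.sqrt(math.sum(gkyzTerm, len) / len) * math.sqrt(annual)
plot(gkyz, "GKYZ Volatility", color=color.blue)
```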
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, the main applications being technical analysis and volatility modeling.
The moving average is designed as such that older observations are given lower weights. The weights fall exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
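The EWMA variance recursion can be written in a few lines of Pine Script v5; the RiskMetrics-style 0.94 default lambda is an illustrative assumption:

```
//@version=5
indicator("EWMA volatility sketch")
lam = input.float(0.94, "Lambda", minval=0.0, maxval=1.0)

r = math.log(close / close[1])
var float ewmaVar = na
// Recursive variance: yesterday's estimate decayed by lambda plus today's squared return
ewmaVar := na(ewmaVar[1]) ? math.pow(r, 2) : lam * ewmaVar[1] + (1 - lam) * math.pow(r, 2)
plot(math.sqrt(ewmaVar), "EWMA Volatility", color=color.blue)
```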
Standard Deviation of Log Returns
This is the simplest calculation of volatility . It's the standard deviation of ln(close/close(1))
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θ·avg(var; M) + (1 − θ)·avg(var; N) = 2θ·var / (M + 1 − (M − 1)L) + 2(1 − θ)·var / (N + 1 − (N − 1)L), where L denotes the lag operator.
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L^-1var - avg (var; N) against avg (var; M) - avg (var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
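That calculation maps directly to a few lines of Pine Script v5; the manual version below should match the built-in ta.atr(), and the 14-bar default is illustrative:

```
//@version=5
indicator("ATR sketch")
len = input.int(14, "ATR Length", minval=1)

// True range: the widest of the three candidate ranges
tr1 = high - low
tr2 = math.abs(high - nz(close[1], close))
tr3 = math.abs(low - nz(close[1], close))
trueRange = math.max(tr1, math.max(tr2, tr3))

atrManual = ta.rma(trueRange, len)   // Wilder-style smoothing of the true range
plot(atrManual, "Manual ATR", color=color.blue)
plot(ta.atr(len), "Built-in ATR", color=color.orange)
```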
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Standard Deviation
Standard deviation is a statistic that measures the dispersion of a dataset relative to its mean and is calculated as the square root of the variance. The standard deviation is calculated as the square root of variance by determining each data point's deviation relative to the mean. If the data points are further from the mean, there is a higher deviation within the data set; thus, the more spread out the data, the higher the standard deviation.
Adaptive Deviation
By definition, the Standard Deviation (STD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values. In technical analysis we usually use it to measure the level of current volatility .
Standard Deviation is based on Simple Moving Average calculation for mean value. This version of standard deviation uses the properties of EMA to calculate what can be called a new type of deviation, and since it is based on EMA , we can call it EMA deviation. And added to that, Perry Kaufman's efficiency ratio is used to make it adaptive (since all EMA type calculations are nearly perfect for adapting).
The difference compared to the standard deviation is significant, and not only because of the EMA usage; the efficiency ratio also makes it behave a bit more logically in very volatile market conditions.
Median Absolute Deviation
The median absolute deviation is a measure of statistical dispersion. Moreover, the MAD is a robust statistic, being more resilient to outliers in a data set than the standard deviation. In the standard deviation, the distances from the mean are squared, so large deviations are weighted more heavily, and thus outliers can heavily influence it. In the MAD, the deviations of a small number of outliers are irrelevant.
Because the MAD is a more robust estimator of scale than the sample variance or standard deviation, it works better with distributions without a mean or variance, such as the Cauchy distribution.
For this indicator, I used a manual recreation of the quantile function in Pine Script. This is so users have a full inside view into how this is calculated.
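For a rough idea of the calculation (using the built-in rolling median rather than the hand-rolled quantile function mentioned above), a Pine Script v5 sketch with illustrative inputs:

```
//@version=5
indicator("Median Absolute Deviation sketch")
len = input.int(20, "Lookback", minval=1)

med = ta.median(close, len)                  // rolling median of price
mad = ta.median(math.abs(close - med), len)  // median of absolute deviations from it
plot(mad, "MAD", color=color.blue)
```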
Efficiency-Ratio Adaptive ATR
Average True Range (ATR) is a widely used indicator in technical analysis. It is calculated as the RMA of the true range. This version adds a twist: it uses Perry Kaufman's Efficiency Ratio to calculate an adaptive true range.
Mean Absolute Deviation
The mean absolute deviation (MAD) is a measure of variability that indicates the average distance between observations and their mean. MAD uses the original units of the data, which simplifies interpretation. Larger values signify that the data points spread out further from the average. Conversely, lower values correspond to data points bunching closer to it. The mean absolute deviation is also known as the mean deviation and average absolute deviation.
This definition of the mean absolute deviation sounds similar to the standard deviation ( SD ). While both measure variability, they have different calculations. In recent years, some proponents of MAD have suggested that it replace the SD as the primary measure because it is a simpler concept that better fits real life.
For Pine coders, this is equivalent to using ta.dev().
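A small Pine Script v5 sketch of that equivalence, computing the mean absolute deviation from the window's SMA by hand alongside ta.dev(); the lookback is illustrative:

```
//@version=5
indicator("Mean Absolute Deviation sketch")
len = input.int(20, "Lookback", minval=1)

mean = ta.sma(close, len)
float sumAbs = 0.0
for i = 0 to len - 1
    sumAbs += math.abs(close[i] - mean)   // distance of each bar in the window from the window mean
manualMad = sumAbs / len

plot(manualMad,          "Manual mean absolute deviation", color=color.blue)
plot(ta.dev(close, len), "ta.dev()",                       color=color.orange)
```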
Additional features will be added in future releases.
█ Giga Kaleidoscope Modularized Trading System
Core components of an NNFX algorithmic trading strategy
The NNFX algorithm is built on the principles of trend, momentum, and volatility. There are seven core components in the NNFX trading algorithm:
1. Volatility - price volatility; e.g., Average True Range, True Range Double, Close-to-Close, etc.
2. Baseline - a moving average to identify price trend
3. Confirmation 1 - a technical indicator used to identify trends
4. Confirmation 2 - a technical indicator used to identify trends
5. Continuation - a technical indicator used to identify trends
6. Volatility/Volume - a technical indicator used to identify volatility/volume breakouts/breakdown
7. Exit - a technical indicator used to determine when a trend is exhausted
What is Volatility in the NNFX trading system?
In the NNFX (No Nonsense Forex) trading system, ATR (Average True Range) is typically used to measure the volatility of an asset. It is used as a part of the system to help determine the appropriate stop loss and take profit levels for a trade. ATR is calculated by taking the average of the true range values over a specified period.
True range is calculated as the maximum of the following values:
-Current high minus the current low
-Absolute value of the current high minus the previous close
-Absolute value of the current low minus the previous close
ATR is a dynamic indicator that changes with changes in volatility. As volatility increases, the value of ATR increases, and as volatility decreases, the value of ATR decreases. By using ATR in NNFX system, traders can adjust their stop loss and take profit levels according to the volatility of the asset being traded. This helps to ensure that the trade is given enough room to move, while also minimizing potential losses.
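As an illustration of that idea only, the Pine Script v5 sketch below plots a volatility-scaled stop and target for a hypothetical long entered at the current close; the 1.5x and 1.0x multiples are common NNFX-style choices used here as assumptions, not values prescribed by this module:

```
//@version=5
indicator("ATR stop/target sketch", overlay=true)
atrLen = input.int(14, "ATR Length", minval=1)
slMult = input.float(1.5, "Stop Loss ATR Multiple", minval=0.1)
tpMult = input.float(1.0, "Take Profit ATR Multiple", minval=0.1)

atrVal = ta.atr(atrLen)
// Stop and target distances scale with current volatility
longStop   = close - slMult * atrVal
longTarget = close + tpMult * atrVal
plot(longStop,   "Long Stop",   color=color.red)
plot(longTarget, "Long Target", color=color.green)
```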
Other types of volatility include True Range Double (TRD), Close-to-Close, and Garman-Klass
What is a Baseline indicator?
The baseline is essentially a moving average, and is used to determine the overall direction of the market.
The baseline in the NNFX system is used to filter out trades that are not in line with the long-term trend of the market. The baseline is plotted on the chart along with other indicators, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR).
Trades are only taken when the price is in the same direction as the baseline. For example, if the baseline is sloping upwards, only long trades are taken, and if the baseline is sloping downwards, only short trades are taken. This approach helps to ensure that trades are in line with the overall trend of the market, and reduces the risk of entering trades that are likely to fail.
By using a baseline in the NNFX system, traders can have a clear reference point for determining the overall trend of the market, and can make more informed trading decisions. The baseline helps to filter out noise and false signals, and ensures that trades are taken in the direction of the long-term trend.
What is a Confirmation indicator?
Confirmation indicators are technical indicators that are used to confirm the signals generated by primary indicators. Primary indicators are the core indicators used in the NNFX system, such as the Average True Range (ATR), the Moving Average (MA), and the Relative Strength Index (RSI).
The purpose of the confirmation indicators is to reduce false signals and improve the accuracy of the trading system. They are designed to confirm the signals generated by the primary indicators by providing additional information about the strength and direction of the trend.
Some examples of confirmation indicators that may be used in the NNFX system include the Bollinger Bands, the MACD (Moving Average Convergence Divergence), and the MACD Oscillator. These indicators can provide information about the volatility, momentum, and trend strength of the market, and can be used to confirm the signals generated by the primary indicators.
In the NNFX system, confirmation indicators are used in combination with primary indicators and other filters to create a trading system that is robust and reliable. By using multiple indicators to confirm trading signals, the system aims to reduce the risk of false signals and improve the overall profitability of the trades.
What is a Continuation indicator?
In the NNFX (No Nonsense Forex) trading system, a continuation indicator is a technical indicator that is used to confirm a current trend and predict that the trend is likely to continue in the same direction. A continuation indicator is typically used in conjunction with other indicators in the system, such as a baseline indicator, to provide a comprehensive trading strategy.
What is a Volatility/Volume indicator?
Volume indicators, such as the On Balance Volume (OBV), the Chaikin Money Flow (CMF), or the Volume Price Trend (VPT), are used to measure the amount of buying and selling activity in a market. They are based on the trading volume of the market, and can provide information about the strength of the trend. In the NNFX system, volume indicators are used to confirm trading signals generated by the Moving Average and the Relative Strength Index. Volatility indicators include Average Direction Index, Waddah Attar, and Volatility Ratio. In the NNFX trading system, volatility is a proxy for volume and vice versa.
By using volume indicators as confirmation tools, the NNFX trading system aims to reduce the risk of false signals and improve the overall profitability of trades. These indicators can provide additional information about the market that is not captured by the primary indicators, and can help traders to make more informed trading decisions. In addition, volume indicators can be used to identify potential changes in market trends and to confirm the strength of price movements.
What is an Exit indicator?
The exit indicator is used in conjunction with other indicators in the system, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR), to provide a comprehensive trading strategy.
The exit indicator in the NNFX system can be any technical indicator that is deemed effective at identifying optimal exit points. Examples of exit indicators that are commonly used include the Parabolic SAR, the Average Directional Index (ADX), and the Chandelier Exit.
The purpose of the exit indicator is to identify when a trend is likely to reverse or when the market conditions have changed, signaling the need to exit a trade. By using an exit indicator, traders can manage their risk and prevent significant losses.
In the NNFX system, the exit indicator is used in conjunction with a stop loss and a take profit order to maximize profits and minimize losses. The stop loss order is used to limit the amount of loss that can be incurred if the trade goes against the trader, while the take profit order is used to lock in profits when the trade is moving in the trader's favor.
Overall, the use of an exit indicator in the NNFX trading system is an important component of a comprehensive trading strategy. It allows traders to manage their risk effectively and improve the profitability of their trades by exiting at the right time.
How does Loxx's GKD (Giga Kaleidoscope Modularized Trading System) implement the NNFX algorithm outlined above?
Loxx's GKD v1.0 system has five types of modules (indicators/strategies). These modules are:
1. GKD-BT - Backtesting module (Volatility, Number 1 in the NNFX algorithm)
2. GKD-B - Baseline module (Baseline and Volatility/Volume, Numbers 1 and 2 in the NNFX algorithm)
3. GKD-C - Confirmation 1/2 and Continuation module (Confirmation 1/2 and Continuation, Numbers 3, 4, and 5 in the NNFX algorithm)
4. GKD-V - Volatility/Volume module (Volatility/Volume, Number 6 in the NNFX algorithm)
5. GKD-E - Exit module (Exit, Number 7 in the NNFX algorithm)
(additional module types will be added in future releases)
Each module interacts with every module by passing data between modules. Data is passed between each module as described below:
GKD-B => GKD-V => GKD-C(1) => GKD-C(2) => GKD-C(Continuation) => GKD-E => GKD-BT
That is, the Baseline indicator passes its data to Volatility/Volume. The Volatility/Volume indicator passes its values to the Confirmation 1 indicator. The Confirmation 1 indicator passes its values to the Confirmation 2 indicator. The Confirmation 2 indicator passes its values to the Continuation indicator. The Continuation indicator passes its values to the Exit indicator, and finally, the Exit indicator passes its values to the Backtest strategy.
This chaining of indicators requires that each module conform to Loxx's GKD protocol, thereby allowing for the testing of every possible combination of technical indicators that make up the seven components of the NNFX algorithm.
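On TradingView, this kind of hand-off is typically done through source inputs: each module plots its output values, and the next module imports them with an input.source() selector. A stripped-down Pine Script v5 sketch of the receiving side is shown below; the input title and plot are illustrative and not the exact code of the GKD modules:

```
//@version=5
indicator("GKD-style module import sketch")
// Select the plotted output of the upstream GKD module from the chart
upstreamSignal = input.source(close, "Input into C1 or Backtest")
// The receiving module would decode and act on the imported values here
plot(upstreamSignal, "Imported upstream values", color=color.blue)
```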
What does the application of the GKD trading system look like?
Example trading system:
Backtest: Strategy with 1-3 take profits, trailing stop loss, multiple types of PnL volatility, and 2 backtesting styles
Baseline: Hull Moving Average as shown on the chart above
Volatility/Volume: Hurst Exponent
Confirmation 1: Fisher Transform
Confirmation 2: Williams Percent Range
Continuation: Fisher Transform
Exit: Rex Oscillator
Each GKD indicator is denoted with a module identifier of either: GKD-BT, GKD-B, GKD-C, GKD-V, or GKD-E. This allows traders to understand to which module each indicator belongs and where each indicator fits into the GKD protocol chain.
Giga Kaleidoscope Modularized Trading System Signals (based on the NNFX algorithm)
Standard Entry
1. GKD-C Confirmation 1 Signal
2. GKD-B Baseline agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
Baseline Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
6. GKD-C Confirmation 1 signal was less than 7 candles prior
Volatility/Volume Entry
1. GKD-V Volatility/Volume signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 2 agrees
5. GKD-B Baseline agrees
6. GKD-C Confirmation 1 signal was less than 7 candles prior
Continuation Entry
1. Standard Entry, Baseline Entry, or Pullback Entry triggered previously
2. GKD-B Baseline hasn't crossed since entry signal trigger
3. GKD-C Confirmation Continuation Indicator signals
4. GKD-C Confirmation 1 agrees
5. GKD-B Baseline agrees
6. GKD-C Confirmation 2 agrees
1-Candle Rule Standard Entry
1. GKD-C Confirmation 1 signal
2. GKD-B Baseline agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
Next Candle:
1. Price retraced (Long: current close < previous close, or Short: current close > previous close)
2. GKD-B Baseline agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
1-Candle Rule Baseline Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 1 signal was less than 7 candles prior
Next Candle:
1. Price retraced (Long: current close < previous close, or Short: current close > previous close)
2. GKD-B Baseline agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume Agrees
1-Candle Rule Volatility/Volume Entry
1. GKD-V Volatility/Volume signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 1 signal was less than 7 candles prior
Next Candle:
1. Price retraced (Long: current close < previous close, or Short: current close > previous close)
2. GKD-V Volatility/Volume agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-B Baseline agrees
PullBack Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is beyond 1.0x Volatility of Baseline
Next Candle:
1. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
2. GKD-C Confirmation 1 agrees
3. GKD-C Confirmation 2 agrees
4. GKD-V Volatility/Volume Agrees
█ Setting up the GKD
The GKD system involves chaining indicators together. These are the steps to set this up.
Use a GKD-C indicator alone on a chart
1. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Simple"
Use a GKD-V indicator alone on a chart
Nothing; it's already usable on the chart without any settings changes
Use a GKD-B indicator alone on a chart
Nothing; it's already usable on the chart without any settings changes
Baseline (Baseline, Backtest)
1. Import the GKD-B Baseline into the GKD-BT Backtest: "Input into Volatility/Volume or Backtest (Baseline testing)"
2. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Baseline"
Volatility/Volume (Volatility/Volume, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Solo"
2. Inside the GKD-V indicator, change the "Signal Type" setting to "Crossing" (neither traditional nor both can be backtested)
3. Import the GKD-V indicator into the GKD-BT Backtest: "Input into C1 or Backtest"
4. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Volatility/Volume"
5. Inside the GKD-BT Backtest, a) change the setting "Backtest Type" to "Trading" if using a directional GKD-V indicator; or, b) change the setting "Backtest Type" to "Full" if using a directional or non-directional GKD-V indicator (non-directional GKD-V can only test Longs and Shorts separately)
6. If "Backtest Type" is set to "Full": Inside the GKD-BT Backtest, change the setting "Backtest Side" to "Long" or "Short
7. If "Backtest Type" is set to "Full": To allow the system to open multiple orders at one time so you test all Longs or Shorts, open the GKD-BT Backtest, click the tab "Properties" and then insert a value of something like 10 orders into the "Pyramiding" settings. This will allow 10 orders to be opened at one time which should be enough to catch all possible Longs or Shorts.
Solo Confirmation Simple (Confirmation, Backtest)
1. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Simple"
2. Import the GKD-C indicator into the GKD-BT Backtest: "Input into Backtest"
3. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Solo Confirmation Simple"
Solo Confirmation Complex without Exits (Baseline, Volatility/Volume, Confirmation, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Complex"
4. Import the GKD-V indicator into the GKD-C indicator: "Input into C1 or Backtest"
5. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full wo/ Exits"
6. Import the GKD-C into the GKD-BT Backtest: "Input into Exit or Backtest"
Solo Confirmation Complex with Exits (Baseline, Volatility/Volume, Confirmation, Exit, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Complex"
4. Import the GKD-V indicator into the GKD-C indicator: "Input into C1 or Backtest"
5. Import the GKD-C indicator into the GKD-E indicator: "Input into Exit"
6. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full w/ Exits"
7. Import the GKD-E into the GKD-BT Backtest: "Input into Backtest"
Full GKD without Exits (Baseline, Volatility/Volume, Confirmation 1, Confirmation 2, Continuation, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C 1 indicator, change the "Confirmation Type" setting to "Confirmation 1"
4. Import the GKD-V indicator into the GKD-C 1 indicator: "Input into C1 or Backtest"
5. Inside the GKD-C 2 indicator, change the "Confirmation Type" setting to "Confirmation 2"
6. Import the GKD-C 1 indicator into the GKD-C 2 indicator: "Input into C2"
7. Inside the GKD-C Continuation indicator, change the "Confirmation Type" setting to "Continuation"
8. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full wo/ Exits"
9. Import the GKD-E into the GKD-BT Backtest: "Input into Exit or Backtest"
Full GKD with Exits (Baseline, Volatility/Volume, Confirmation 1, Confirmation 2, Continuation, Exit, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C 1 indicator, change the "Confirmation Type" setting to "Confirmation 1"
4. Import the GKD-V indicator into the GKD-C 1 indicator: "Input into C1 or Backtest"
5. Inside the GKD-C 2 indicator, change the "Confirmation Type" setting to "Confirmation 2"
6. Import the GKD-C 1 indicator into the GKD-C 2 indicator: "Input into C2"
7. Inside the GKD-C Continuation indicator, change the "Confirmation Type" setting to "Continuation"
8. Import the GKD-C Continuation indicator into the GKD-E indicator: "Input into Exit"
9. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full w/ Exits"
10. Import the GKD-E into the GKD-BT Backtest: "Input into Backtest"
Baseline + Volatility/Volume (Baseline, Volatility/Volume, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Baseline + Volatility/Volume"
2. Inside the GKD-V indicator, make sure the "Signal Type" setting is set to "Traditional"
3. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
4. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Baseline + Volatility/Volume"
5. Import the GKD-V into the GKD-BT Backtest: "Input into C1 or Backtest"
6. Inside the GKD-BT Backtest, change the setting "Backtest Type" to "Full". For this backtest, you must test Longs and Shorts separately
7. To allow the system to open multiple orders at one time so you can test all Longs or Shorts, open the GKD-BT Backtest, click the tab "Properties" and then insert a value of something like 10 orders into the "Pyramiding" settings. This will allow 10 orders to be opened at one time which should be enough to catch all possible Longs or Shorts.
Requirements
Outputs
Chained or Standalone: GKD-BT or GKD-V
Stack 1: GKD-C Continuation indicator
Stack 2: GKD-C Continuation indicator