Dark feature release with zero-second latency and zero-net effect in React
All featureflow users know the drill: if you want to add a new feature to your React application, you can put it behind a feature flag, release it silently, then choose a rollout plan to suit. You can run internal or beta testing first, get feedback from a cohort of users, and validate your KPIs before rolling out to everyone.
That's the magic of feature flags. But there are trade-offs.
The cost of hidden features
Having lots of dark features on a page (features that are not visible to the main cohort of users) can create unnecessary bloat. A couple of hidden components is generally fine, but if a new feature brings in a new library, for example, the effect can be tangible. Why download a bunch of components and libraries that may never be made visible?
In addition, I have found that with the client SDKs of other feature flag tools, such as LaunchDarkly, you are forced to wait until the flags load before rendering the page (or put up with 'flashing' defaults) – this adds a bottleneck to the entire load experience.
Well, React's lazy loading (available since React 16.6) makes this problem simple to solve. Combined with the unique capabilities of the featureflow client SDKs, you can have your cake and eat it too.
“Dark feature testing with Zero Net effect and Zero Second latency”
Fixing the package size by code splitting
The React.lazy method makes it easy to code-split an application at the component level using dynamic imports. For example:
import React, { lazy } from 'react';

const ExperimentalButtonComponent = lazy(() => import('./ExperimentalButtonComponent'));

const PageComponent = () => (
  <div>
    <ExperimentalButtonComponent />
  </div>
);
So if you have a new feature that you would like to introduce silently, without affecting the initial load of the main codebase, you can wrap it in lazy – this is especially useful if that component is the sole user of a third-party library. The code-splitting logic is smart enough not to include that third-party dependency in the main JavaScript chunk either.
Flagging that
To manage this load, simply wrap it in a featureflow flag:
import React, { lazy } from 'react';
import { useFeatureflow } from 'react-featureflow-client';

const ExperimentalButtonComponent = lazy(() => import('./ExperimentalButtonComponent'));

const PageComponent = () => {
  const featureflow = useFeatureflow();
  return (
    <div>
      {featureflow.evaluate('experimental-button-component').isOn() &&
        <ExperimentalButtonComponent />
      }
    </div>
  );
};
Suspense
There will always be a slight delay while a code-split component is retrieved, so even for an internal beta it is important to handle the loading state. This is where <Suspense> comes in.
import React, { lazy, Suspense } from 'react';
import { useFeatureflow } from 'react-featureflow-client';

const ExperimentalButtonComponent = lazy(() => import('./ExperimentalButtonComponent'));

const renderLoading = () => <p>Loading</p>;

const PageComponent = () => {
  const featureflow = useFeatureflow();
  return (
    <div>
      {featureflow.evaluate('experimental-button-component').isOn() &&
        <Suspense fallback={renderLoading()}>
          <ExperimentalButtonComponent />
        </Suspense>
      }
    </div>
  );
};
Suspense takes a fallback, which lets you show another component while the lazy component is loading. So the experience is complete for your beta or internal testers, and because the new component is loaded asynchronously, if the flag is off for the main cohort of users the component is never downloaded at all – therefore there is zero effect on the bundle size.
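Stripped of the React specifics, the gating logic boils down to this: the dynamic import only fires when the flag evaluates on. A framework-agnostic sketch (loadWhenOn and the stub loader are illustrative helpers, not part of any SDK):

```javascript
// Only trigger the (potentially expensive) dynamic import when the flag is on.
// `loader` stands in for () => import('./ExperimentalButtonComponent').
function loadWhenOn(isOn, loader) {
  if (!isOn) {
    return Promise.resolve(null); // flag off: the chunk is never requested
  }
  return loader(); // flag on: kick off the network request for the chunk
}

// Demo with a stub loader that records whether it was called.
let chunkRequested = false;
const fakeLoader = () => {
  chunkRequested = true;
  return Promise.resolve({ default: 'ExperimentalButtonComponent' });
};

loadWhenOn(false, fakeLoader).then(mod => {
  console.log(mod, chunkRequested); // null false – nothing was downloaded
});
```

With the flag off, the loader function is never invoked, so the browser never even issues the request for the extra chunk – that is the "zero net effect" in miniature.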
The feature flag in the room
But wait – I said zero net effect on bundle size and zero-second latency – we still have to wait for the flag to evaluate, right?
This is where featureflow's unique zero-latency evaluation comes in:
Using the Server-Side SDK to pass back flags in the initial load
The Featureflow JavaScript SDK works hand in hand with your server SDK. You can pass the initial flag values back with your user object or the initial page, after which the client SDK takes over, managing live updates and evaluations.
- If you haven’t already, go and sign up for featureflow – it takes 1 minute and is free
- Create a feature flag
- Add the featureflow SDK to your project
- Pass back your flag values in the main page header or with your user object
- Pass those values to the featureflow.init call
Featureflow server-side evaluations happen on your server, in memory, with virtually zero latency. Passing the initial values back with your page or user object piggybacks on data calls you already have in motion, which removes the initial call to featureflow – so even on a brand-new load with a new user, the featureflow client SDK is primed with the features it needs.
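The hand-off can be sketched framework-free: the server embeds its evaluated flag values in the initial page, and the client reads them back before initialising the SDK. (renderFeaturesScript and readBootstrappedFeatures are illustrative helpers, not part of the featureflow API):

```javascript
// Server side: serialise the evaluated flag values (e.g. the result of
// featureflow.getClientFeatures(user)) into a script tag in the initial HTML,
// so the browser has them at first paint with no extra network call.
function renderFeaturesScript(features) {
  return `<script>window.__FEATURES__ = ${JSON.stringify(features)};</script>`;
}

// Client side: read the bootstrapped values back out, falling back to an empty object.
function readBootstrappedFeatures(windowLike) {
  return windowLike.__FEATURES__ || {};
}

// Round trip: what the server wrote is exactly what the client reads.
const evaluated = { 'experimental-button-component': 'on' };
const html = renderFeaturesScript(evaluated);
const fakeWindow = { __FEATURES__: JSON.parse(html.match(/__FEATURES__ = (.*);<\/script>/)[1]) };
console.log(readBootstrappedFeatures(fakeWindow)['experimental-button-component']); // 'on'
```

The values the client reads here are what you would hand to featureflow.init (or the provider's features option), so the first flag evaluation never waits on the network.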
On the server, use featureflow.getClientFeatures(user) to get feature rules evaluated specifically for the front-end SDK, and store these values in your initial page load or user call – e.g. in user.features. Then in your React application you can do:
import { withFeatureflowProvider } from 'react-featureflow-client';

const FF_KEY = 'js-env-YOUR_KEY_HERE';

// `user` would typically come back from your own API, with the
// server-evaluated flag values attached in user.features
const user = {
  attributes: {
    tier: 'gold',
    country: 'australia',
    roles: ['role1', 'role2']
  }
};
const features = user.features;

export default withFeatureflowProvider({
  apiKey: FF_KEY,
  config: {
    streaming: true
  },
  features,
  user
})(App);
import React, { lazy, Suspense } from 'react';
import { useFeatureflow } from 'react-featureflow-client';

const ExperimentalButtonComponent = lazy(() => import('./ExperimentalButtonComponent'));

const renderLoading = () => <p>Loading</p>;

const PageComponent = () => {
  const featureflow = useFeatureflow();
  return (
    <div>
      {featureflow.evaluate('experimental-button-component').isOn() &&
        <Suspense fallback={renderLoading()}>
          <ExperimentalButtonComponent />
        </Suspense>
      }
    </div>
  );
};
Using the Client SDK
The client-side SDK (built into the React SDK) is smart.
It has numerous caching features to ensure evaluations are near-zero latency:
- Calls to featureflow are cached in a globally available CDN
- Calls use ETag headers to minimise throughput and latency on the majority of requests
- Featureflow uses a unique smart evaluation to ensure that caching still applies even when features have time-based rules set (where your user context data may change frequently). This makes featureflow the most efficient JavaScript SDK on the market.
- Featureflow uses localStorage and browser caching to make subsequent loads near-instantaneous
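The ETag flow in the second point can be sketched without any SDK internals: the client sends back the tag it last saw, and the server answers 304 with no body when nothing has changed. (The toy server, cache, and fetchFlags function below are illustrative, not featureflow code):

```javascript
// A toy flag server: returns 304 Not Modified when the client's ETag still matches.
const server = {
  etag: 'v1',
  body: { 'experimental-button-component': 'on' },
  respond(ifNoneMatch) {
    if (ifNoneMatch === this.etag) {
      return { status: 304, body: null }; // nothing to download
    }
    return { status: 200, etag: this.etag, body: this.body };
  }
};

// A toy client cache: only replaces its copy when the server sends a new body.
const cache = { etag: null, flags: null };
function fetchFlags() {
  const res = server.respond(cache.etag);
  if (res.status === 200) {
    cache.etag = res.etag;
    cache.flags = res.body;
  }
  return cache.flags; // on a 304 the cached copy is reused untouched
}

fetchFlags(); // first call: full 200 response, cache populated
fetchFlags(); // second call: 304, near-zero payload, cached flags reused
```

On every call after the first, the response carries no flag body at all – the client keeps serving its cached copy, which is why repeat evaluations stay near-zero latency.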
To use the client SDK in client mode, simply instantiate it as per the guidelines at https://www.npmjs.com/package/react-featureflow-client
import { withFeatureflowProvider } from 'react-featureflow-client';

const FF_KEY = 'js-env-YOUR_KEY_HERE';

const user = {
  attributes: {
    tier: 'gold',
    country: 'australia',
    roles: ['role1', 'role2']
  }
};

export default withFeatureflowProvider({
  apiKey: FF_KEY,
  config: {
    streaming: true
  },
  user
})(App);
That's the magic of using our SDKs – we take care of all that complexity.
For more info, check us out at http://www.featureflow.io
Read our docs at https://docs.featureflow.io/
And find the React SDK on GitHub – https://github.com/featureflow/react-featureflow-client