Most apps that you build with Firebase’s backend services, such as Realtime Database, Cloud Firestore, and Cloud Storage, need some way to sign users in: among other things, this lets you provide a consistent experience across sessions and devices, and lets you set user-specific permissions. Firebase Authentication helps you meet this requirement by providing libraries and services that you can use to quickly build a new sign-in system for your app.
But what if your organization already uses a service such as Okta to handle user identity? With Firebase Custom Authentication, you can use any user identity service (including Okta) to authenticate with Firebase, and this post will show you how.
You’ll learn how to build a Firebase and Okta integration, which will have two components: a token exchange endpoint that verifies Okta access tokens and mints Firebase custom tokens, and a web frontend that signs users in with Okta and then with Firebase.
By the way, this approach can also be used with some modification for other identity services, such as Auth0, Azure Active Directory, or your own custom system.
Ready to get started? Great! But, before you write any code, you’ll need to set up your Okta and Firebase projects.
First, set up an Okta project on the Okta Developer site: create an Okta developer account if you don’t already have one, then create a new OpenID Connect web application.
Set the Base URIs and Login redirect URIs to the location where you plan to host your web frontend (http://localhost:5000 if you’re using the Firebase Hosting emulator) and enable the Authorization Code grant type.
When you’re done, take note of the app's Client ID for later.
Then, set up a Firebase project in the Firebase console: create a new project and register a web app with it.
If you plan to eventually host your web app with Firebase, you can automatically set up Firebase Hosting and simplify configuration by enabling Also set up Firebase Hosting for this app.
Finally, if you plan to deploy your token exchange endpoint as a Cloud Function, install the Firebase CLI and initialize Cloud Functions in your project directory with firebase init functions.
Now that your projects are set up, you’ll write the crucial piece: the token exchange endpoint.
The job of the token exchange endpoint is to take a user’s Okta access token and, if it’s valid, produce a Firebase custom authentication token that represents the same user.
This endpoint needs to be able to verify the authenticity of the Okta access token. To accomplish this, use the Express.js middleware provided in Okta’s developer documentation (reproduced below, with minor modifications):
```javascript
const OktaJwtVerifier = require('@okta/jwt-verifier');

const OKTA_ORG_URL = 'https://YOUR_OKTA_ORG.okta.com'; // Your Okta org URL

const oktaJwtVerifier = new OktaJwtVerifier({
  issuer: `${OKTA_ORG_URL}/oauth2/default`
});

// Middleware to authenticate requests with an Okta access token.
const oktaAuth = async (req, res, next) => {
  const authHeader = req.headers.authorization || '';
  const match = authHeader.match(/Bearer (.+)/);
  if (!match) {
    res.status(401);
    return next('Unauthorized');
  }

  const accessToken = match[1];
  try {
    const jwt = await oktaJwtVerifier.verifyAccessToken(
        accessToken, 'api://default');
    req.jwt = jwt;
    return next(); // Pass the request on to the main route.
  } catch (err) {
    console.log(err.message);
    res.status(401);
    return next('Unauthorized');
  }
};
```
Any endpoint protected by this middleware will require a valid Okta access token in the Authorization header. If the token is valid, it will insert the decoded token into the request before passing the request along by calling next().
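The header-parsing step at the top of the middleware is simple enough to sketch on its own. Here is a minimal Python illustration (not Okta’s SDK; the function name is ours) of pulling the bearer token out of an Authorization header, which is exactly what the middleware does before verifying the token:

```python
import re

def extract_bearer_token(authorization_header):
    """Return the bearer token from an Authorization header, or None."""
    match = re.match(r"Bearer (.+)", authorization_header or "")
    return match.group(1) if match else None

# A well-formed header yields the raw token...
print(extract_bearer_token("Bearer eyJhbGciOi.example.token"))  # → eyJhbGciOi.example.token
# ...and anything else should be rejected with a 401.
print(extract_bearer_token("Basic dXNlcjpwYXNz"))  # → None
```

Anything that fails this check never reaches the token verifier, which is why the middleware returns 401 immediately on a missing or malformed header.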
Now, you can write the token exchange endpoint:
```javascript
const express = require('express');
const app = express();
const cors = require('cors')({origin: 'https://YOUR_DOMAIN'});
const firebaseAdmin = require('firebase-admin');

const firebaseApp = firebaseAdmin.initializeApp();

// Get a Firebase custom auth token for the authenticated Okta user.
// This endpoint uses the `oktaAuth` middleware defined above to
// ensure requests have a valid Okta access token.
app.get('/firebaseCustomToken', [cors, oktaAuth], async (req, res) => {
  const oktaUid = req.jwt.claims.uid;
  try {
    const firebaseToken = await firebaseApp.auth().createCustomToken(oktaUid);
    res.send(firebaseToken);
  } catch (err) {
    console.log(err.message);
    res.status(500).send('Error minting token.');
  }
});
```
This endpoint uses the Firebase Admin SDK to mint a Firebase custom authentication token using the user’s Okta UID. When you sign a user in with this token for the first time (on the frontend), Firebase Authentication will add a user record with the same UID to your project.
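For context, a Firebase custom token is just a JSON Web Token (JWT) containing the UID plus issued-at and expiry claims, signed by your service account. The sketch below is illustrative only: it builds a demo token with HS256 and a dummy secret, whereas the Admin SDK signs real custom tokens with RS256 using your service account key (and includes additional claims such as the issuer):

```python
import base64, hashlib, hmac, json, time

def b64url(data: bytes) -> str:
    # JWTs use unpadded, URL-safe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_demo_token(uid: str, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}  # Real custom tokens use RS256.
    now = int(time.time())
    payload = {"uid": uid, "iat": now, "exp": now + 3600}
    signing_input = (b64url(json.dumps(header).encode()) + "." +
                     b64url(json.dumps(payload).encode()))
    signature = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(signature)

token = make_demo_token("00u1abcd", b"demo-secret")
print(token.count("."))  # a JWT has three dot-separated segments → prints 2
```

The key point is that the UID claim inside the token is what Firebase Authentication uses to create or look up the user record.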
This process of using an Okta access token to acquire a Firebase custom token is the key idea behind integrating Okta and Firebase. But, let’s go one step further and write a simple web frontend to demonstrate the use of the endpoint.
The demo frontend is a plain HTML and JavaScript web app that uses the Firebase Authentication Web SDK and Okta’s sign-in widget library.
Start with two containers: one for authenticated user content and one for Okta’s sign-in widget:
```html
<div id="authenticated-user-content" hidden>
  <h2>Authenticated with Firebase</h2>
  <p id="user-info"></p>
  <button onclick="firebase.auth().signOut();">Sign out</button>
</div>

<div id="signin-widget" hidden></div>
```
Set up a Firebase authentication state listener that shows some user profile information to signed-in users and Okta’s sign-in widget to signed-out users:
```javascript
const oktaSignIn = new OktaSignIn({
  baseUrl: OKTA_ORG_URL,
  redirectUri: window.location.href,
  authParams: {
    display: 'page',
  },
  el: '#signin-widget',
});

firebase.auth().onAuthStateChanged((user) => {
  if (user) {
    // User is signed in. Display some user profile information.
    document.getElementById('user-info').innerHTML =
        `Hi, ${user.displayName}! Your email address is ${user.email}
         and your UID is ${user.uid}.`;
    document.getElementById('authenticated-user-content').hidden = false;
    document.getElementById('signin-widget').hidden = true;
  } else {
    // User is signed out. Display the Okta sign-in widget.
    oktaSignIn.showSignInToGetTokens({
      clientId: OKTA_CLIENT_ID,
      redirectUri: window.location.href,
      getAccessToken: true,
      getIdToken: true,
      scope: 'openid profile email',
    });
    document.getElementById('authenticated-user-content').hidden = true;
    document.getElementById('signin-widget').hidden = false;
  }
});
```
When a user signs in with Okta’s widget, their browser briefly redirects to Okta’s authorization server, and then, assuming the user signed in successfully, redirects back to your app with the response.
Use Okta’s sign-in library to get the Okta access token from the response and use the access token to get a Firebase custom token from your token exchange endpoint:
```javascript
if (oktaSignIn.hasTokensInUrl()) {
  // Get the access token from Okta.
  const oktaTokenResponse = await oktaSignIn.authClient.token.parseFromUrl();
  const accessToken = oktaTokenResponse.tokens.accessToken.value;

  // Use the access token to call the firebaseCustomToken endpoint.
  const firebaseTokenResponse = await fetch(CUSTOM_TOKEN_ENDPOINT, {
    headers: {
      'Authorization': `Bearer ${accessToken}`,
    }
  });
  const firebaseToken = await firebaseTokenResponse.text();

  // (Continued below.)
}
```
And finally, authenticate with Firebase using the custom token:
```javascript
// (Continued from above.)
try {
  await firebase.auth().signInWithCustomToken(firebaseToken);
} catch (err) {
  console.error('Error signing in with custom token.');
}
```
When the call to signInWithCustomToken() completes, the auth state listener will detect the change and display the user’s profile information.
At this point, the user is authenticated with Firebase and you can use any of Firebase’s authentication-enabled services, such as Realtime Database, Cloud Firestore, and Cloud Storage. See the Security Rules documentation for more information on granting resource access to authenticated users.
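As a concrete illustration, a minimal Cloud Firestore rule that restricts each user’s document to that user might look like the following sketch (the match path and collection name are hypothetical; adjust them to your own data model):

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Only the signed-in user (identified by the Okta UID minted above)
    // can read or write their own profile document.
    match /users/{userId} {
      allow read, write: if request.auth != null && request.auth.uid == userId;
    }
  }
}
```

Because the custom token carries the Okta UID, `request.auth.uid` in your rules corresponds directly to the user’s Okta identity.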
For the complete demo app and backend that the code snippets above came from, see the Authenticate with Firebase using Okta sample on GitHub.
As many game developers know, customizing your game for different types of players is a great way to increase player engagement and retention. That’s why Firebase offers products like Analytics, Remote Config, Predictions and A/B Testing that allow you to tailor your app’s content and configuration for different player segments based on profile, past actions and future predicted behavior. For example, you can provide different onboarding flows for players based on their country or simplify a game level for players who are predicted to churn in hopes of keeping them engaged.
One key area for personalization includes monetization, but figuring out the right strategy for the right group of players can be tricky. That’s why PeopleFun, maker of some of the top word games on Android and iOS, turned to Firebase Predictions. They used Predictions in their hit game Wordscapes to create player segments based on predicted behavior and identify players who were unlikely to make in-app purchases in the next seven days. Those players were shown more rewarded video ads, while the players likely to make a purchase were not. This helped PeopleFun achieve the right balance between ads and IAP.
Read more about how PeopleFun used Firebase Predictions to increase lifetime value by up to 5% here, and check out other ways Firebase can help you supercharge your games!
In a competitive app ecosystem, making sure your app doesn’t crash frequently is integral to your app’s success. So with the graduation of Firebase Crashlytics SDK out of Beta, we think it’s a good time to highlight the benefits of integrating Crashlytics into your app. Read on for a refresher on the essential tools that Crashlytics provides to help you debug crashes and get the most out of your crash reports.
Even with access to crash reports, getting to the root cause of a crash can be time-consuming. The Crashlytics dashboard not only provides a holistic, clear view of what your users are experiencing; with crash insights, you also get detailed suggestions on what could have caused a fatal error.
Crash insights appear on your dashboard next to the crash report and provide additional context by highlighting potential root causes, such as SDK bugs and API misuse, that might be common across multiple apps. This serves as a starting point for investigation, which saves you time and speeds up your workflow.
It can be frustrating to see a user run into a crash that you can’t seem to reproduce on your end. Crashlytics can help with this by allowing you to track the state and sequence of application usage prior to a crash through custom keys and custom logs. Custom keys provide a snapshot of information at one point in time, recording the last known value; custom logs record the events a user went through during their session.
For example, you might want to know how many items a user had in their shopping cart before a crash occurred. If you name a key with a string (e.g. “item purchase count”) and set its value programmatically, Crashlytics will upload that key and its last known value with the next crash report. These keys and values are then visible right next to your stack trace.
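The distinction between the two mechanisms is worth pinning down: a custom key overwrites its previous value, so only the last value is reported, while logs accumulate, so the whole sequence is reported. A toy Python model of that behavior (purely conceptual; this is not the Crashlytics API):

```python
class CrashContext:
    """Toy model: custom keys keep only the last value; logs keep every entry."""
    def __init__(self):
        self.keys = {}
        self.logs = []

    def set_custom_key(self, name, value):
        self.keys[name] = value  # Overwrites: last known value wins.

    def log(self, message):
        self.logs.append(message)  # Accumulates: full sequence is kept.

ctx = CrashContext()
ctx.set_custom_key("item purchase count", 2)
ctx.log("added item to cart")
ctx.set_custom_key("item purchase count", 3)
ctx.log("opened checkout")

print(ctx.keys)       # → {'item purchase count': 3}
print(len(ctx.logs))  # → 2
```

In other words, use keys for "what was the state at crash time?" and logs for "what happened leading up to the crash?".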
Even with custom keys and logs, trying to manually capture every event your user triggers in your app can be daunting. However, if you integrate Crashlytics with Google Analytics, you can automatically capture predefined Google Analytics events, known as breadcrumbs. Breadcrumbs can further enhance the data captured with custom logs, giving you even more information on the cause of a crash.
Just like custom logs and keys, breadcrumbs can be found within your stack trace in the Crashlytics dashboard, and will show the actions a user has taken prior to a crash, as well as the parameters within the event.
For instance, going back to the shopping cart example, breadcrumbs will capture event parameters like product ID, product name, type of currency used, quantity of items in the cart, etc. Here is a full list of the automatically collected events that Google Analytics breadcrumbs captures.
You never want to miss a critical user issue, but it can be tough to stay on top of crash reports around the clock. With Crashlytics alerts, you can configure real-time notifications at three levels of severity. Velocity alerts are high priority: they fire when an issue crosses a crash threshold within your user base. Regression alerts, typically medium priority, fire when a previously closed issue recurs in a new version of your app. New issue alerts, generally low priority, fire when an issue occurs for the first time.
You can customize these alerts in the Crashlytics console, and receive them via Slack, PagerDuty, Jira, or email.
Not only can you view your crashes in the Crashlytics dashboard, but you can also export all of your Crashlytics data to BigQuery, which lets you filter and segment your user data for deeper analysis. For example, you can spot emerging crashes in new code, or see today’s top issues so you can prioritize and fix them faster.
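To make the "top issues today" idea concrete, here is a small Python sketch of the kind of aggregation you might run over exported crash events (the data and field layout are toy examples; the real Crashlytics BigQuery export has its own schema):

```python
from collections import Counter
from datetime import date

# Hypothetical exported crash events: (issue_id, event_date) pairs.
events = [
    ("ISSUE_A", date(2020, 7, 1)),
    ("ISSUE_B", date(2020, 7, 1)),
    ("ISSUE_A", date(2020, 7, 1)),
    ("ISSUE_C", date(2020, 6, 30)),
]

def top_issues(events, on_day, n=3):
    """Count crash events per issue on a given day, most frequent first."""
    counts = Counter(issue for issue, day in events if day == on_day)
    return counts.most_common(n)

print(top_issues(events, date(2020, 7, 1)))  # → [('ISSUE_A', 2), ('ISSUE_B', 1)]
```

In BigQuery this would be a simple GROUP BY over the exported events table; the point is that raw event-level access lets you slice the data however your triage process needs.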
You can also use our Data Studio template to easily visualize this data with custom dashboards. Data Studio dashboards are easy to collaborate on and share so your team can work more efficiently; even your team members who aren't comfortable with SQL can easily maneuver BigQuery data sets.
And recently we also launched the ability to export this data in real time, enabling you to power custom workflows and alerts based on real-time data.
These are just a few examples of the exciting things you can do with Crashlytics to keep your apps stable and your users happy. As always, if you need help getting started please feel free to reach out to us directly through our Community Slack or via Stack Overflow!
Our team is driven by the belief that apps have drastically improved the way we live, work, learn, and socialize, keeping us connected to each other and plugged into the information we need. Now more than ever, we understand the importance of supporting our developer community by ensuring you have the technology and resources you need to keep your business up and running. Whether you’re a high-growth startup or a global enterprise, we’re still here to help you build and operate your app.
TensorFlow Lite is the official framework for running TensorFlow models on mobile and edge devices. It is used in many of Google’s major mobile apps, as well as in applications by third-party developers. When deploying TensorFlow Lite models in production, you may come across situations where you need support features that the framework does not provide out of the box, such as over-the-air model deployment, monitoring of model inference performance on real user devices, and A/B testing of multiple model versions.
In these cases, instead of building your own solutions, you can leverage Firebase to quickly implement these features in just a few lines of code.
Firebase is Google’s comprehensive app development platform, providing the infrastructure and libraries that make app development easier on both Android and iOS. Firebase Machine Learning offers multiple solutions for using machine learning in mobile applications.
In this blog post, we show you how to leverage Firebase to enhance your deployment of TensorFlow Lite models in production. We also have codelabs for both Android and iOS that show you, step by step, how to integrate these Firebase features into your TensorFlow Lite app.
You may want to deploy your machine learning model over-the-air to your users instead of bundling it into your app binary. For example, the machine learning team that builds the model may have a different release cycle than the mobile app team and want to ship new models independently of app releases. Or you may want to lazy-load machine learning models, to save device storage for users who don’t need the ML-powered feature and to reduce your app size for faster downloads from the Play Store and App Store.
With Firebase Machine Learning, you can deploy models instantly. You can upload your TensorFlow Lite model to Firebase from the Firebase Console.
You can also upload your model to Firebase using the Firebase ML Model Management API. This is especially useful when you have a machine learning pipeline that automatically retrains models with new data and uploads them directly to Firebase. Here is a code snippet in Python to upload a TensorFlow Lite model to Firebase ML.
```python
import firebase_admin
from firebase_admin import ml

firebase_admin.initialize_app(options={'storageBucket': 'YOUR_PROJECT.appspot.com'})

# Load a tflite file and upload it to Cloud Storage.
source = ml.TFLiteGCSModelSource.from_tflite_model_file('example.tflite')

# Create the model object.
tflite_format = ml.TFLiteFormat(model_source=source)
model = ml.Model(display_name="example_model", model_format=tflite_format)

# Add the model to your Firebase project and publish it.
new_model = ml.create_model(model)
ml.publish_model(new_model.model_id)
```
Once your TensorFlow Lite model has been uploaded to Firebase, you can download it in your mobile app at any time and initialize a TensorFlow Lite interpreter with the downloaded model. Here is how you do it on Android.
```kotlin
val remoteModel = FirebaseCustomRemoteModel.Builder("example_model").build()

// Get the last/cached model file.
FirebaseModelManager.getInstance().getLatestModelFile(remoteModel)
    .addOnCompleteListener { task ->
        val modelFile = task.result
        if (modelFile != null) {
            // Initialize a TF Lite interpreter with the downloaded model.
            interpreter = Interpreter(modelFile)
        }
    }
```
There is a diverse range of mobile devices on the market today, from flagship devices with powerful chips optimized for running machine learning models to budget devices with low-end CPUs. As a result, your model’s inference speed may vary widely across your user base, leaving you wondering whether your model is too slow, or even unusable, for users on low-end devices.
You can use Performance Monitoring to measure how long your model inference takes across all of your users’ devices. Because it is impractical to test on every device on the market in advance, the best way to understand your model’s performance in production is to measure it directly on user devices. Firebase Performance Monitoring is a general-purpose tool for measuring the performance of mobile apps, so you can also measure any arbitrary process in your app, such as pre-processing or post-processing code. Here is how you do it on Android.
```kotlin
// Initialize and start a Firebase Performance Monitoring trace
val modelInferenceTrace = firebasePerformance.newTrace("model_inference")
modelInferenceTrace.start()

// Run inference with TensorFlow Lite
interpreter.run(...)

// End the Firebase Performance Monitoring trace
modelInferenceTrace.stop()
```
Performance data measured on each user device is uploaded to Firebase server and aggregated to provide a big picture of your model performance across your user base. From the Firebase console, you can easily identify devices that demonstrate slow inference, or see how inference speed differs between OS versions.
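As a toy illustration of that aggregation (fabricated numbers and device names; in practice the Firebase console does this for you), finding the device model with the slowest typical inference is just a group-and-compare over trace durations:

```python
from statistics import median

# Hypothetical "model_inference" trace durations (ms), grouped by device model.
durations_ms = {
    "Pixel 4": [12, 14, 13, 15],
    "BudgetPhone X": [95, 110, 102],
}

# Pick the device model with the highest median inference time.
slowest = max(durations_ms, key=lambda d: median(durations_ms[d]))
print(slowest, median(durations_ms[slowest]))  # → BudgetPhone X 102
```

A gap like this between device tiers is exactly the signal that might lead you to ship a smaller or quantized model to low-end devices.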
When you iterate on your machine learning model and come up with an improved version, you may be eager to release it to production right away. However, it is not uncommon for a model to perform well on test data but fail in production. The best practice, therefore, is to roll out your new model to a small set of users, A/B test it against the original model, and closely monitor how it affects your key business metrics before releasing it to all of your users.
Firebase A/B Testing enables you to run this kind of A/B test with minimal effort. The steps required are:
1. Upload both versions of your TensorFlow Lite model to Firebase ML.
2. Create a Remote Config parameter (for example, model_name) that controls which model each device downloads.
3. Create an A/B Testing experiment that serves a different parameter value to each variant and tracks the metrics you care about.
Here is an example of setting up an A/B test with TensorFlow Lite models: we deliver each of two versions of our model to 50% of our user base, with the goal of optimizing for multiple metrics.
Then we change our app to fetch the model name from Firebase and use it to download the TensorFlow Lite model assigned to each device.
```kotlin
val remoteConfig = Firebase.remoteConfig
remoteConfig.fetchAndActivate()
    .addOnCompleteListener(this) { task ->
        // Get the model name from Firebase Remote Config
        val modelName = remoteConfig["model_name"].asString()

        // Download the model from Firebase ML
        val remoteModel = FirebaseCustomRemoteModel.Builder(modelName).build()
        val manager = FirebaseModelManager.getInstance()
        manager.download(remoteModel, FirebaseModelDownloadConditions.Builder().build())
            .addOnCompleteListener {
                manager.getLatestModelFile(remoteModel)
                    .addOnCompleteListener { fileTask ->
                        val modelFile = fileTask.result
                        if (modelFile != null) {
                            // Initialize a TF Lite interpreter with the downloaded model
                            interpreter = Interpreter(modelFile)
                        }
                    }
            }
    }
```
After you have started the A/B test, Firebase will automatically aggregate the metrics on how your users react to different versions of your model and show you which version performs better. Once you are confident with the A/B test result, you can roll out the better version to all of your users with just one click.
Check out this codelab (Android version or iOS version) to learn, step by step, how to integrate these Firebase features into your app. It starts with an app that uses a TensorFlow Lite model to recognize handwritten digits and shows you how to deploy the model with Firebase ML, measure its inference performance with Performance Monitoring, and A/B test model versions with A/B Testing and Remote Config.
Amy Jang, Ibrahim Ulukaya, Justin Hong, Morgan Chen, Sachin Kotwani