Face Detection and Emotion Analysis in Flutter using Google ML Kit

Facial recognition is no longer the stuff of science fiction. From unlocking smartphones to measuring customer sentiment in retail, face detection has grown into a core feature of modern apps. With Google ML Kit and Flutter, you can bring this magic to your app — fast, offline, and on-device.
In this story, we'll explore how to integrate face detection in Flutter using ML Kit and even go a step further with basic emotion analysis.
🤔 Why Face Detection?
Face detection isn't just about recognizing a face — it can help you:
- Detect faces in live camera feed or photos
- Track multiple faces simultaneously
- Identify facial landmarks (eyes, nose, mouth)
- Analyze head rotation and smiling probability
- Build fun filters, focus tracking, and emotion-aware apps
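As a taste of what the API exposes, here is a hedged sketch of reading landmarks and head rotation from a detected Face. It assumes a detector configured with enableLandmarks: true and enableClassification: true; field names follow google_mlkit_face_detection 0.10.x and may differ in other versions:

```dart
import 'dart:math' show Point;
import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

/// Illustrative helper: prints a few attributes of a detected face.
/// Assumes the detector was created with `enableLandmarks: true`
/// and `enableClassification: true`.
void describeFace(Face face) {
  final Point<int>? nose = face.landmarks[FaceLandmarkType.noseBase]?.position;
  if (nose != null) {
    print('Nose base at (${nose.x}, ${nose.y})');
  }
  // Euler angles describe head rotation: Y = yaw (left/right), Z = roll (tilt).
  print('Head yaw: ${face.headEulerAngleY}, roll: ${face.headEulerAngleZ}');
  print('Smiling probability: ${face.smilingProbability}');
}
```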
🛠️ Getting Started
Step 1: Add Dependencies
Update your pubspec.yaml with the following:
dependencies:
  google_mlkit_face_detection: ^0.10.0
  camera: ^0.10.5
  permission_handler: ^11.0.0
Run:
flutter pub get
Step 2: Configure Android & iOS
Android
- Set minSdkVersion to 21 or higher
- Add the camera permission in AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA"/>
iOS
Add camera usage permission in Info.plist:
<key>NSCameraUsageDescription</key>
<string>We need camera access for face detection.</string>
📷 Capturing a Live Feed
Use the camera plugin to access and stream the device's camera.
Initialize your camera controller and start the image stream:
final controller = CameraController(camera, ResolutionPreset.medium);
await controller.initialize();
await controller.startImageStream(onImageAvailable);
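Before the three lines above can run, you need a camera and permission to use it. Here's a sketch that puts the pieces together, assuming the permission_handler plugin and a front-facing camera; the helper name setUpCamera is illustrative:

```dart
import 'package:camera/camera.dart';
import 'package:permission_handler/permission_handler.dart';

/// Illustrative helper: requests camera permission, picks the front camera,
/// and returns an initialized controller.
Future<CameraController> setUpCamera() async {
  final status = await Permission.camera.request();
  if (!status.isGranted) {
    throw StateError('Camera permission denied');
  }
  final cameras = await availableCameras();
  // Prefer the front camera for face detection; fall back to the first one.
  final camera = cameras.firstWhere(
    (c) => c.lensDirection == CameraLensDirection.front,
    orElse: () => cameras.first,
  );
  final controller = CameraController(
    camera,
    ResolutionPreset.medium,
    enableAudio: false, // audio isn't needed for face detection
  );
  await controller.initialize();
  return controller;
}
```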
🧠 Detecting Faces
Set up the ML Kit face detector:
final options = FaceDetectorOptions(
  enableContours: true,
  enableClassification: true, // enables smile & eye-open probabilities
  performanceMode: FaceDetectorMode.fast,
);
final faceDetector = FaceDetector(options: options);
Then process the incoming image stream:
Future<void> onImageAvailable(CameraImage image) async {
  // getInputImageFromCameraImage is a helper you write yourself to convert
  // the CameraImage into ML Kit's InputImage format.
  final inputImage = getInputImageFromCameraImage(image);
  final faces = await faceDetector.processImage(inputImage);
  for (final face in faces) {
    final smileProb = face.smilingProbability;
    final leftEyeOpenProb = face.leftEyeOpenProbability;
    print("Smile: $smileProb, Left Eye Open: $leftEyeOpenProb");
  }
}
Use these probabilities to infer simple emotional states:
- Happy if smilingProbability > 0.7
- Sleepy or blinking if leftEyeOpenProbability < 0.3
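The stream handler relies on getInputImageFromCameraImage, a conversion helper you provide. A minimal sketch for an Android NV21 camera stream follows; the metadata classes and rotation handling vary between google_mlkit_* versions, so treat this as a starting point rather than a drop-in implementation:

```dart
import 'dart:typed_data';
import 'dart:ui' show Size;
import 'package:camera/camera.dart';
import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

/// Illustrative conversion from a [CameraImage] to ML Kit's [InputImage].
/// Assumes an Android camera streaming NV21 frames.
InputImage getInputImageFromCameraImage(CameraImage image) {
  // Concatenate the bytes of all image planes into a single buffer.
  final bytes = Uint8List.fromList(
    image.planes.expand((plane) => plane.bytes).toList(),
  );
  return InputImage.fromBytes(
    bytes: bytes,
    metadata: InputImageMetadata(
      size: Size(image.width.toDouble(), image.height.toDouble()),
      rotation: InputImageRotation.rotation0deg, // derive from sensor orientation
      format: InputImageFormat.nv21,             // assumption: Android NV21 stream
      bytesPerRow: image.planes.first.bytesPerRow,
    ),
  );
}
```

On iOS the stream format is typically BGRA8888, and the rotation must be computed from the device and sensor orientation, so check the plugin's documentation for your exact versions.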
😄 Emotion Detection (Basic)
While ML Kit doesn't directly label emotions like "happy", "sad", or "angry", you can infer basic emotional states by combining smile and eye openness data.
For example:
String analyzeEmotion(Face face) {
  if (face.smilingProbability != null && face.smilingProbability! > 0.8) {
    return "Happy";
  } else if (face.leftEyeOpenProbability != null && face.leftEyeOpenProbability! < 0.3) {
    return "Sleepy";
  }
  return "Neutral";
}
For deeper emotion detection, consider integrating TensorFlow Lite with pre-trained models.
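One housekeeping detail the snippets above leave out: both the camera controller and the detector hold native resources, so release them when the feature's screen goes away. A minimal sketch (call it from your widget's dispose-time logic):

```dart
import 'package:camera/camera.dart';
import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

/// Stops the image stream and frees native resources.
Future<void> tearDown(CameraController controller, FaceDetector detector) async {
  await controller.stopImageStream();
  await controller.dispose();
  await detector.close(); // releases the on-device ML Kit detector
}
```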
🧩 Real-World Use Cases
- Real-time selfie filters
- Smile-to-unlock apps
- Emotion-aware customer feedback tools
- Attendance tracking using face presence
- Security & access control systems
🛡️ Privacy & Ethics
Always inform users when capturing facial data. Since ML Kit processes everything on-device, you get strong privacy by design — no data leaves the user's phone.
But as a best practice:
- Prompt for explicit consent
- Avoid storing facial data unless necessary
- Be transparent in your privacy policy
🧪 Testing the Feature
Use real devices for testing face detection. Emulators generally do not support camera streams, and face detection can behave differently depending on lighting, angle, and camera quality.
🚀 What's Next?
Want to take it further?
- Detect facial expressions with custom ML models
- Build AR filters using facial landmarks
- Combine with speech input for AI chatbots
- Use TensorFlow Lite for deeper emotion classification
📣 Wrapping Up
Face detection in Flutter is easier than ever, thanks to Google ML Kit. Whether you're building a fun app or a productivity tool, facial features and basic emotion insights can make your UI more interactive and intelligent.
Let your app see what's going on — and react accordingly.
If you found this story helpful, you can support me at Buy Me a Coffee!