Where does Flutter end and the platform begin?

The heart of the Flutter framework (which developers often see only as a set of Dart classes) is written in C++ and compiled into a binary artifact known as the Flutter Engine. It is linked into the application and accessed from Dart through native binding mechanisms: the @pragma('vm:entry-point') annotation for calls from the Flutter Engine into Dart, and @Native together with external for accessing C++ code in the Flutter Engine from Dart.

In reality, however, the Flutter Engine contains no platform-specific code (it is only built for the target hardware architecture): it does not know how the platform event loop works or how to create threads, has no surface on which to render the scene, and cannot receive information about user actions (screen touches, mouse movement, key presses) or system events.
This architectural decision was made so that Flutter applications can potentially run on any device with a screen (even an LED panel). In this article we will talk about the Flutter Embedder, its role in launching an application and binding it to system event loops, and we will also build a simple embedder that publishes a Flutter application as a VNC server.

Before moving to Flutter, let's look at a simple C++ application that runs on Linux and uses the GTK+ library to control visual elements in the application window. First of all, we need to define the main function (the entry point of the application), which will call a sequence of functions from libgtk to configure the GTK context:

  • create an application window (set title, size);

  • get a surface to draw on the window (DrawingArea);

  • register callback functions to handle mouse events and keystrokes;

  • attach new frame event processing to prepare the next image (for example, during animation);

  • if necessary, create additional threads to perform computational tasks (not blocking animation) and I/O operations (file processing and network communication).

After this, the general logic of the application boils down to changing state when external events are received and preparing a sequence of commands to draw the interface on a new frame (when anything has changed).
Since responsibility for creating the UI lies with the Dart code (the Flutter framework), and the processing of user events is also tied to Dart (Listener and the gesture arena), a similar task in a Flutter application can be solved as follows:

  • create a Dart Runtime and pass it the executable code and precompiled constants (the snapshot) of our application and of the Dart part of the Flutter framework;

  • get the necessary system bindings (event loop, frame scheduler), subscribe to pointer (mouse or touch) and keyboard events, and forward them to the Dart Runtime when they occur;

  • for each new frame, call a function from Dart that initiates the Build -> Render pipeline.

In reality, between the Dart Runtime and the base launch code (whose role is played by the embedder) sits the Flutter Engine, which internally includes the Dart VM and a set of classes responsible for handling events and rendering the scene on various target platforms (for example, into a framebuffer or via OpenGL/ANGLE). The Flutter Engine exposes a set of functions for the embedder and methods for registering callback functions, described in the C API header flutter_embedder.h.
The header can be obtained either by building your own Flutter Engine or together with pre-built engines for different architectures. Please note that to use a Flutter Engine built for the arm64 architecture (for example, when running MacOS Desktop applications on M1 processors and newer), a pre-built engine is available from a separate community repository.

It is very important that the version of the Flutter Engine used to run the application exactly matches the version of the Flutter SDK that was used to compile the source code into the snapshot. The required engine version (a commit hash) can be read from the file bin/internal/engine.version in the Flutter SDK.
Let's take Flutter 3.19.4 as an example; its engine hash is a5c24f538d05aaf66f7972fb23959d8cafb9f95a. We find a suitable engine for our operating system and hardware architecture, download the archive, and extract several files from it:

  • the Flutter Engine binary artifact (on Windows – a DLL, on Linux/Android – a .so, on MacOS/iOS – a Framework);

  • embedder.h – the header used from our application's startup code;

  • icudtl.dat – ICU data required to launch the Flutter Engine.
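
The download location can be sketched as follows (the bucket layout and the platform segment are assumptions based on the public flutter_infra_release conventions; verify them for your platform before relying on them):

```shell
# Engine hash taken from bin/internal/engine.version for Flutter 3.19.4
ENGINE_HASH=a5c24f538d05aaf66f7972fb23959d8cafb9f95a
# Platform segment: e.g. darwin-x64, darwin-arm64, linux-x64, windows-x64
PLATFORM=darwin-arm64
# Hypothetical URL layout for the pre-built engine artifacts archive
ENGINE_URL="https://storage.googleapis.com/flutter_infra_release/flutter/${ENGINE_HASH}/${PLATFORM}/artifacts.zip"
echo "$ENGINE_URL"
```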

How to build Flutter Engine?

To compile the Flutter Engine, the Ninja build system is used, configured through the gn utility. The entire set of build tools can be obtained from depot_tools:

git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git
export PATH=`pwd`/depot_tools:$PATH

Then the Flutter Engine sources and all third-party packages can be fetched with:

fetch flutter
cd src

When used this way, the latest version of Flutter is fetched from the main branch on GitHub (you can pin a version by editing the repository link in the generated .gclient file, for example by appending @3.16.1, and then running gclient sync). Next you need to prepare the build tools and set the configuration flags via gn. Here you can define the target platform and hardware architecture, enable experimental features (e.g. --enable-impeller-3d), replace the Dart SDK with a custom one, and enable generation of examples for various embedders (including a simple GLFW embedder with OpenGL support). For example, to build the engine for MacOS arm64 with impeller-3d support, you can use the following combination of flags:

flutter/tools/gn --target-os mac --mac-cpu arm64 --enable-impeller-3d --unoptimized

As a result, a framework (FlutterMacOS.framework) will be created for linking with the embedder, as well as a large number of utilities (including benchmarks and unit tests) for compiling code and shaders. The flutter_embedder.h file itself can also be taken from the engine sources on GitHub.

Create your own embedder

Let us consider the main groups of functions defined in flutter_embedder.h:

Starting and stopping Flutter Engine

  • start: FlutterEngineRun is passed the command line arguments and a data structure with the configuration of the drawing surface and a large number of callback methods for receiving notifications from the Flutter Engine (for example, to intercept messages printed to the console). On success it fills in a FlutterEngine handle, which is used in all subsequent calls. Under the hood it calls FlutterEngineInitialize (preparing the engine for work) and FlutterEngineRunInitialized (launching the Dart application from its main function). Using the separate methods is useful when you need to define custom TaskRunners before any code starts executing. Additionally, you can pass user_data, an arbitrary structure that will be handed as a pointer to all callback methods;

  • prepare to stop: FlutterEngineDeinitialize stops all scheduled tasks from running and prepares the engine to shut down;

  • stop: FlutterEngineShutdown takes the FlutterEngine handle and frees the resources previously allocated to the engine. This is usually done when the application terminates (or when leaving the screen where the Flutter application is used).

To run the embedder, you must fill out the FlutterRendererConfig configuration structure, which includes the following fields:

  • type – the driver type used for rendering the scene (the simplest, without acceleration support, is FlutterRendererType.kSoftware; it can also be kOpenGL, kMetal or kVulkan);

  • one of the driver configuration structures: FlutterSoftwareRendererConfig, FlutterOpenGLRendererConfig, FlutterMetalRendererConfig or FlutterVulkanRendererConfig. In our example we will use software rendering, but on platforms that support acceleration it is more rational to use the corresponding libraries. For FlutterSoftwareRendererConfig only one callback method is defined, surface_present_callback, which accepts a pixel buffer (framebuffer) and is responsible for transferring the pixel colors to the display device.

In addition to the renderer configuration, a FlutterProjectArgs object must be passed, defining the number and contents of command line arguments, the path to the assets, and the path to the ICU data (the icudtl.dat file distributed with the Flutter Engine archive). The assets directory is located relative to the project root in build/flutter_assets (it is created by flutter build bundle).

Let's look at a simple embedder example that will run Flutter Engine on MacOS:

#include <iostream>
#include <vector>

#include "flutter_embedder.h"

// Declared elsewhere in the embedder: receives the rendered framebuffer.
extern bool present_callback(void* user_data, const void* allocation,
                             size_t row_bytes, size_t height);

int main(int argc, char** argv) {
  FlutterRendererConfig config = {};
  config.type = kSoftware;
  config.software.struct_size = sizeof(FlutterSoftwareRendererConfig);
  config.software.surface_present_callback = present_callback;

  std::vector<const char*> engine_command_line_args = {
      "--disable-observatory",
      "--dart-non-checked-mode",
  };
  FlutterProjectArgs args = {
      .struct_size = sizeof(FlutterProjectArgs),
      .assets_path = argv[1],
      .icu_data_path = argv[2],
      .command_line_argc = static_cast<int>(engine_command_line_args.size()),
      .command_line_argv = engine_command_line_args.data(),
  };
  FlutterEngine engine = nullptr;
  auto result = FlutterEngineRun(FLUTTER_ENGINE_VERSION, &config, &args,
                                 /* user_data */ nullptr, &engine);
  if (result != kSuccess) {
    std::cout << "Error in starting flutter engine" << std::endl;
    return 1;
  }
  //----------------------------------
  // here we will draw
  //----------------------------------
  result = FlutterEngineShutdown(engine);

  if (result != kSuccess) {
    std::cout << "Error in shutting down." << std::endl;
  }
}

To run it, we compile our project and pass two command line arguments:

cd /project
flutter build bundle
cd /embedder
./myembedder /project/build/flutter_assets `pwd`/icudtl.dat

So we have started the Flutter Engine (inside which the Dart VM runs and the snapshot from the assets directory is loaded) and configured it for software rendering (without a hardware accelerator) through a framebuffer that is passed to the present_callback function; this is enough for the application to start producing the image obtained by rendering the scene. Now let's look at how to pass user and system events to the engine (if a graphical interface is used, they can be intercepted via the GTK event loop; in other cases, the event source can be an interrupt or another mechanism for polling input devices).
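
As an illustration of the software rendering path, here is a minimal sketch of a present callback that copies the engine's pixel buffer into our own framebuffer structure. The FrameBuffer type is a hypothetical stand-in for the real target surface (e.g. the VNC framebuffer); the callback's shape follows surface_present_callback from flutter_embedder.h.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Stand-in for the embedder's target surface (e.g. the VNC framebuffer);
// not part of the engine API.
struct FrameBuffer {
  size_t width = 0;
  size_t height = 0;
  std::vector<uint8_t> pixels;  // 4 bytes per pixel
};

// Matches the shape of FlutterSoftwareRendererConfig::surface_present_callback:
// the engine hands us its rendered pixels and the byte stride of each row.
bool present_callback(void* user_data, const void* allocation,
                      size_t row_bytes, size_t height) {
  auto* fb = static_cast<FrameBuffer*>(user_data);
  fb->height = height;
  fb->width = row_bytes / 4;  // assuming 4 bytes per pixel, no row padding
  fb->pixels.resize(row_bytes * height);
  // Copy row by row, keeping the engine's stride.
  const auto* src = static_cast<const uint8_t*>(allocation);
  for (size_t y = 0; y < height; ++y) {
    std::memcpy(fb->pixels.data() + y * row_bytes, src + y * row_bytes,
                row_bytes);
  }
  // Returning true tells the engine the frame was presented successfully.
  return true;
}
```

In a VNC-based embedder this is where the copied pixels would be handed to the server so connected clients see the new frame.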

Event messages

  • FlutterEngineSendWindowMetricsEvent – report changes to the display area metrics (for example, a change of window size) through the FlutterWindowMetricsEvent structure;

  • FlutterEngineSendPointerEvent – send a pointer event (for example, a finger touching the screen or a mouse move; it can also be any touchpad event for which two-dimensional coordinates can be determined). The FlutterPointerEvent structure carries the coordinates, event time, device type, button type, scroll position, displacement, scaling and rotation; the identifier of the View that received the event can also be passed, which is used with platform views;

  • FlutterEngineSendKeyEvent – send a key press/release event; works asynchronously (on completion of processing it calls a callback and passes the user_data supplied during engine initialization);

  • FlutterEngineUpdateSemanticsEnabled – enable or disable support for accessibility subsystem capabilities;

  • FlutterEngineUpdateAccessibilityFeatures – determine the capabilities of the accessibility subsystem used (bit mask);

  • FlutterEngineDispatchSemanticsAction – send a semantic action (FlutterSemanticsAction) to the specified semantic node (node_id). In practice this is used by embedders to deliver actions from accessibility services (for example, voice control or swipes in TalkBack), but it can also be used to deliver interactions with semantic tree nodes from special control devices or sensors;

  • FlutterEngineReloadSystemFonts – notify the Flutter Engine that the system fonts need to be re-read;

  • FlutterEngineUpdateLocales – update data for system locales (a list of available locales is transmitted in order of preference);

  • FlutterEngineNotifyLowMemoryWarning – notify the Flutter Engine that the system is running low on memory;

  • FlutterEngineNotifyDisplayUpdate – change the list of displays available for output (used, for example, for foldable devices with two virtual displays). The display information describes the screen resolution, refresh rate, and the ratio between logical and physical pixels.
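
To illustrate the key-event path, here is a sketch of filling in a FlutterKeyEvent. So that the example is self-contained, it declares minimal stand-ins for the types that normally come from flutter_embedder.h, and the key code values in the usage are purely illustrative:

```cpp
#include <chrono>
#include <cstdint>

// Minimal stand-ins mirroring flutter_embedder.h; in a real embedder these
// definitions come from the engine header.
enum FlutterKeyEventType {
  kFlutterKeyEventTypeUp = 1,
  kFlutterKeyEventTypeDown,
  kFlutterKeyEventTypeRepeat,
};
struct FlutterKeyEvent {
  size_t struct_size;
  double timestamp;         // microseconds
  FlutterKeyEventType type;
  uint64_t physical;        // physical key code (HID-derived)
  uint64_t logical;         // layout-dependent logical key
  const char* character;    // UTF-8, may be null for non-character keys
  bool synthesized;
};

// Builds a key-down event from a platform key press; a real embedder would
// map GTK/X11/HID scan codes to the physical/logical values.
FlutterKeyEvent MakeKeyDown(uint64_t physical, uint64_t logical,
                            const char* character) {
  FlutterKeyEvent event = {};
  event.struct_size = sizeof(FlutterKeyEvent);
  event.timestamp =
      std::chrono::duration_cast<std::chrono::microseconds>(
          std::chrono::steady_clock::now().time_since_epoch())
          .count();
  event.type = kFlutterKeyEventTypeDown;
  event.physical = physical;
  event.logical = logical;
  event.character = character;
  event.synthesized = false;
  return event;
}
```

In a real embedder the resulting structure is handed to FlutterEngineSendKeyEvent(engine, &event, callback, user_data), which completes asynchronously.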

Frame synchronization

  • FlutterEngineOnVsync – inform the Flutter Engine that the next frame should be drawn (running the BuildOwner pipeline); the exact start time of the next frame is passed (to obtain the current time from the Flutter Engine you can use FlutterEngineGetCurrentTime);

  • FlutterEngineScheduleFrame – schedule forced rendering of the frame (for example, can be used if the texture or other factors affecting the display of the scene have changed);

  • FlutterEngineSetNextFrameCallback – register a callback that will be called from the Flutter Engine after the frame is processed (for example, statistics can be calculated in this place).
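
The timing math behind a vsync callback can be sketched as follows (a minimal example assuming a fixed refresh rate; FlutterEngineGetCurrentTime reports time in nanoseconds):

```cpp
#include <cstdint>

// Arguments for FlutterEngineOnVsync: when the next frame starts and when
// it must be finished.
struct VsyncTimes {
  uint64_t frame_start_nanos;
  uint64_t frame_target_nanos;
};

// Aligns the next frame to the vsync grid of the given refresh rate,
// starting from the current engine clock value.
VsyncTimes NextVsync(uint64_t now_nanos, double refresh_rate_hz) {
  const uint64_t interval = static_cast<uint64_t>(1e9 / refresh_rate_hz);
  // Round up to the next multiple of the vsync interval.
  const uint64_t frame_start = now_nanos + (interval - now_nanos % interval);
  return VsyncTimes{frame_start, frame_start + interval};
}
```

In the embedder's vsync handler you would then call FlutterEngineOnVsync(engine, baton, t.frame_start_nanos, t.frame_target_nanos) with the baton previously supplied by the engine.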

Getting information from Flutter Engine

  • FlutterEngineGetCurrentTime – returns the current system time (from the Flutter Engine's point of view);

  • FlutterEngineRunsAOTCompiledDartCode – checks the code execution mode: returns false in JIT mode (debug) and true in AOT mode (profile/release).

These functions are enough for us to report window size changes, pass key codes, and report interaction with the drawing surface. All we need to do is attach event handlers when initializing GTK and convert the events into structures compatible with the Flutter Engine (described in embedder.h). For example, to handle screen touch events you can use the following handler:

bool SendFlutterPointerEvent(FlutterPointerPhase phase,
                             double x,
                             double y) {
  FlutterPointerEvent event = {};
  event.struct_size = sizeof(event);
  event.phase = phase;
  event.x = x;
  event.y = y;
  event.timestamp =
      std::chrono::duration_cast<std::chrono::microseconds>(
          std::chrono::high_resolution_clock::now().time_since_epoch())
          .count();
  return FlutterEngineSendPointerEvent(engine_, &event, 1) == kSuccess;
}

Here the phase can be determined by comparing the current button state with the previous one (a click on the screen is emulated as a left mouse button event with phase kDown; moving, with or without the button held down, is kMove; releasing the button is kUp).
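
The phase selection described above can be sketched as a small helper (the enum here is a stand-in for the values from flutter_embedder.h, reduced to the ones this embedder uses):

```cpp
// Stand-in for the relevant FlutterPointerPhase values; the full enum
// lives in flutter_embedder.h.
enum FlutterPointerPhase { kCancel, kUp, kDown, kMove };

// Derives the pointer phase from the previous and current state of the
// emulated left mouse button, as described above: moves with or without
// the button held down are both reported as kMove.
FlutterPointerPhase PhaseFromButtons(bool was_pressed, bool is_pressed) {
  if (!was_pressed && is_pressed) return kDown;  // button just went down
  if (was_pressed && !is_pressed) return kUp;    // button just released
  return kMove;                                  // movement or drag
}
```

The embedder keeps the previous button state between events and feeds the result into SendFlutterPointerEvent.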

Similarly, you can subscribe to resizing events and send them to Flutter Engine, filling in the corresponding structure:

bool SetWindowSize(size_t width, size_t height) {
  FlutterWindowMetricsEvent event = {};
  event.struct_size = sizeof(event);
  event.width = width;
  event.height = height;
  event.pixel_ratio = 1.0;
  return FlutterEngineSendWindowMetricsEvent(engine_, &event) == kSuccess;
}

In our example, these calls are bound to the rfb callbacks (created through the libvncserver library); if GTK is used, you can watch for GdkEventMotion and other events and turn them into the corresponding calls into the Flutter Engine. The full source code of the project can be found in the accompanying repository.

To use OpenGL and other 3D libraries, the configuration is more complex and involves a set of callbacks and composition rules. An example of wiring up OpenGL can be found in the GLFW embedder example in the engine sources.

Which embedders have already been created?

In addition to the standard implementations of embedder + shell (which interact with the platform technologies – Java/Kotlin and Swift), there are several custom implementations of the Flutter Embedder:

  • eLinux – a set of tools + embedder for running Flutter on embedded devices (works on Wayland, X11, GBM and EGLStream); also used to run Flutter applications in the AGL project;

  • flutter-pi – an embedder for running Flutter applications on Raspberry Pi single-board computers;

  • https://github.com/flutter-tizen/embedder – an embedder for the Tizen platform (Samsung).

In this part of the article, we looked at the main functions of the Flutter Embedder using the example of a VNC server as the basis for launching Flutter applications. In the second part we will look at the use of textures (displayed via the Texture widget), sending and processing messages through platform channels, managing Task Runners and launching tasks from the embedder side (as well as the mechanisms for linking Task Runners to platform threads), ways to interact with isolates, and collecting traces and statistics from the Dart VM.

Finally, I would like to invite you to a free webinar where I will show how to create a multiplayer Imaginarium-style game with artificial intelligence in Flutter. Register – it will be interesting.
