Documentation of Digital Reporting System Development from the Android Side (1)

Andrea
4 min read · Jul 21, 2024

It cannot be denied that many processes in Indonesia are still handled manually, such as record-keeping and reporting. If these systems were digitised, they would be far more effective and efficient.

On this occasion, I intend to document my experience in assisting with the digitisation of reporting for one unit within the Indonesian police force. This documentation narrates the journey of building the application from the Android side.

Phase 1 (Nov 2021 — Jul 2022)
The work was carried out by a team consisting of one project coordinator, 1–2 product designers, 2–3 frontend developers, 2–3 backend developers, and two Android developers.

Application look in the early phase

The expected system is a reporting system delivered through an Android application, where the application is attached to each patrol vehicle. The application is therefore expected to have the following features:
- Reporting on tasks, with reports in the form of photos, videos, or audio.
- Tracking the position of each vehicle.
- Receiving task descriptions when needed.
- Continuing to run even without an internet connection.

For the Android side, the MVVM architecture is used: the business logic is kept in the Repository, which is called by the ViewModel, which in turn is observed by the Fragment.
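
A minimal sketch of that layering, with hypothetical class and method names (the real report feature is of course more involved):

```kotlin
import androidx.fragment.app.Fragment
import androidx.fragment.app.viewModels
import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import kotlinx.coroutines.launch

// Hypothetical report model for illustration.
data class Report(val description: String)

// Repository: holds the business logic (validation, local storage, upload).
class ReportRepository {
    suspend fun uploadReport(report: Report): Result<Unit> {
        // The API layer (e.g. a Retrofit service) would be called here.
        return Result.success(Unit)
    }
}

// ViewModel: exposes state to the UI and delegates work to the Repository
// (in the actual project the Repository would be injected via Hilt).
class ReportViewModel : ViewModel() {
    private val repository = ReportRepository()

    private val _uploadResult = MutableLiveData<Result<Unit>>()
    val uploadResult: LiveData<Result<Unit>> = _uploadResult

    fun upload(report: Report) {
        viewModelScope.launch {
            _uploadResult.value = repository.uploadReport(report)
        }
    }
}

// Fragment: observes the ViewModel and forwards user actions; no business logic here.
class ReportFragment : Fragment() {
    private val viewModel: ReportViewModel by viewModels()

    private fun onSendClicked(description: String) {
        viewModel.upload(Report(description))
    }
}
```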

The report data itself is uploaded to a dedicated server via an API, and the reports from the various devices are then displayed on a web dashboard.
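
As an illustration, the upload endpoint could be declared roughly like this with Retrofit; the endpoint path and field names here are made up, not the real API:

```kotlin
import okhttp3.MultipartBody
import okhttp3.RequestBody
import retrofit2.Response
import retrofit2.http.Multipart
import retrofit2.http.POST
import retrofit2.http.Part

// Hypothetical Retrofit service; the real endpoint and fields differ.
interface ReportApi {
    @Multipart
    @POST("reports")
    suspend fun uploadReport(
        @Part("description") description: RequestBody,
        @Part attachment: MultipartBody.Part // photo, video, or audio file
    ): Response<Unit>
}
```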

Some of the libraries/frameworks used:
- Retrofit & OkHttp for API calls
- Traccar for tracking the position of each device
- RabbitMQ for sending messages to the application, which are then translated into pop-up notifications
- WorkManager for handling requests when an internet connection is not available (see the sketch after this list)
- AgoraIO for live streaming, online meetings, and calls/agent help
- Hilt for dependency injection
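
A minimal sketch of the offline handling idea, assuming a CoroutineWorker that defers and retries the upload until a connection is available (class and key names are hypothetical):

```kotlin
import android.content.Context
import androidx.work.Constraints
import androidx.work.CoroutineWorker
import androidx.work.NetworkType
import androidx.work.OneTimeWorkRequestBuilder
import androidx.work.WorkManager
import androidx.work.WorkerParameters
import androidx.work.workDataOf

// Hypothetical worker that uploads a pending report once the network is back.
class UploadReportWorker(
    context: Context,
    params: WorkerParameters
) : CoroutineWorker(context, params) {

    override suspend fun doWork(): Result {
        val reportId = inputData.getString("reportId") ?: return Result.failure()
        return try {
            // Read the stored report by id and send it through the API here.
            Result.success()
        } catch (e: Exception) {
            Result.retry() // WorkManager reschedules the attempt
        }
    }
}

// Enqueue the upload; WorkManager holds it until a connection exists.
fun enqueueUpload(context: Context, reportId: String) {
    val request = OneTimeWorkRequestBuilder<UploadReportWorker>()
        .setConstraints(
            Constraints.Builder()
                .setRequiredNetworkType(NetworkType.CONNECTED)
                .build()
        )
        .setInputData(workDataOf("reportId" to reportId))
        .build()
    WorkManager.getInstance(context).enqueue(request)
}
```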

There are also some UI-related libraries such as CameraView, ExoPlayer, NaraeAudioRecorder, Fishbun, and Google Places.

In the process, some libraries (such as Fishbun and NaraeAudioRecorder) were adapted and modified to fit the project’s needs.

Fishbun, a file-picker library, had its appearance changed to match the application’s design, so the library was modified and embedded as a separate module within the application.

Meanwhile, NaraeAudioRecorder was modified to accept audio in OGG format. The modified library was republished to Maven Central under my personal GitHub account, so it can be added as a dependency.
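
Once published to Maven Central, the fork can be pulled in like any other dependency; the coordinates below are placeholders for illustration, not the actual ones:

```kotlin
// build.gradle.kts (coordinates are placeholders)
dependencies {
    implementation("io.github.your-username:naraeaudiorecorder:1.0.0")
}
```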

Application final look

In closing, from the Android side there are, in my opinion, several improvements that could be made, given enough time and energy:
1. Writing unit tests for crucial methods
2. Migrating the old UI to Jetpack Compose
3. Abstracting classes that are nearly identical (for example Fragments and Repositories); a rough sketch follows this list
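
For point 3, one way this could look is a shared base class that gathers the setup code repeated across similar screens; a rough sketch, with hypothetical names:

```kotlin
import android.os.Bundle
import android.view.View
import androidx.annotation.LayoutRes
import androidx.fragment.app.Fragment
import androidx.viewbinding.ViewBinding

// Hypothetical base class collecting boilerplate shared by similar Fragments.
abstract class BaseFragment<VB : ViewBinding>(@LayoutRes layoutId: Int) : Fragment(layoutId) {

    protected var binding: VB? = null
        private set

    // Each screen only supplies its view binding and its own UI setup.
    protected abstract fun bind(view: View): VB
    protected abstract fun setupUi(binding: VB)

    override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
        super.onViewCreated(view, savedInstanceState)
        binding = bind(view).also { setupUi(it) }
    }

    override fun onDestroyView() {
        binding = null // release the binding to avoid leaking the view hierarchy
        super.onDestroyView()
    }
}
```

A concrete screen would then extend it, for example `class ReportFragment : BaseFragment<FragmentReportBinding>(R.layout.fragment_report)` (again, hypothetical names); the same idea applies to a base Repository for shared API and error-handling code.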

The system being used in the field
