A small app with a great purpose. That’s the best way to describe SnapTrash, one of our latest projects and one of the first we’ve developed since shifting from Ruby on Rails to a full JavaScript stack. Here you’ll get to know this iOS app and the development process behind its first version, which will be available soon on the App Store.

What is SnapTrash and why is it relevant

SnapTrash Interface

SnapTrash is an iOS app that lets you report plastic debris found on beaches using nothing more than your phone’s camera and GPS. Every photo is uploaded to a database, accessible from the app, that stores each photo along with its coordinates and the username of the uploader.

This database will be used to organize regular plastic collections, helping keep beaches clean and the ocean plastic-free. The app’s relevance matches its purpose: around eight million tons of plastic enter the ocean every year, causing the deaths of the many marine wildlife species that ingest it daily.

Users get a global view of all reported photos on a map integrated into the app through a set of intuitive icons. Although the app’s main goal is to organize collections of large amounts of plastic, users are also encouraged to take action on their own whenever possible.

Designed with simplicity in mind

The design challenge with SnapTrash was to make it as simple and intuitive as possible, without wasting time on details that would add no value. For this reason, the interface is clean and minimal, making the core tasks easy to complete. After logging in, reporting a photo takes only a couple of taps, as simple as sharing media on Instagram or Snapchat.

Bin bags on the map represent a gallery of photos at a given location, and if you zoom in, you'll see them spread out into individual photos. The remaining icons and buttons don't take up much screen space and were designed to keep the user's attention on SnapTrash's main purpose: keeping the ocean plastic-free.

SnapTrash User Experience

Developing SnapTrash

From the start, my task was to build a mobile app where users could log in through Facebook, take pictures of plastic, and publish them on a map. At the time, I was still experimenting with Strapi (you can find it reviewed here) and it seemed like a good fit for the task. To develop the app itself, I went with React Native.

The server side: Strapi

To keep the photos visible in the app, I needed a way to store them on a server, along with the relevant information about each one. At this stage, the server side is not directly visible to users; it only works as data storage.

Strapi seemed like a good tool for the job, with one limitation: it’s geared towards web app development. As a consequence, its SDK doesn’t fully work in React Native, and it can’t be used for native Facebook login. In any case, Strapi’s Facebook login is currently broken, since Facebook made HTTPS callbacks mandatory and Strapi only generates callback URLs with HTTP.

In the end, the image files are uploaded to the Strapi server through the /upload endpoint, and the photo model references the uploaded file. It also stores the location where the photo was taken and the login info from Facebook (token, user email and user name). That was everything I needed from the server side.
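
As a rough sketch of how that flow can look from the app’s side: the /upload endpoint and its "files" field follow Strapi’s upload plugin, but the server URL, the "photo" content type and its attribute names below are illustrative, not the actual ones used in SnapTrash.

// Sketch of the upload flow, assuming a Strapi alpha instance at STRAPI_URL
// with the default upload plugin and a "photo" content type (names are illustrative).
const STRAPI_URL = 'https://my-strapi-server.example.com';

async function reportPhoto({ uri, latitude, longitude, jwt }) {
  // 1. Upload the image file through Strapi's /upload endpoint.
  const form = new FormData();
  form.append('files', { uri, name: 'photo.jpg', type: 'image/jpeg' });
  const uploadResponse = await fetch(`${STRAPI_URL}/upload`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${jwt}` },
    body: form,
  });
  const [file] = await uploadResponse.json();

  // 2. Create the photo entry that references the uploaded file
  //    and stores the coordinates of the report.
  return fetch(`${STRAPI_URL}/photos`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${jwt}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ image: file.id, latitude, longitude }),
  });
}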

The iOS app: React Native

Regarding the app itself, my first task was simply to display a map, which sounded easy enough. However, if you go bare-bones with the React Native CLI, you're in for trouble: the instructions to set up react-native-maps are nearly incomprehensible for someone with no native development experience. Luckily, the Ignite CLI has a plugin that makes it very easy to add maps to your app.
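
Once react-native-maps is correctly linked, the map itself boils down to a single component, roughly like this (the component name and initial region are just examples):

import React from 'react';
import { StyleSheet } from 'react-native';
import MapView from 'react-native-maps';

// Full-screen map centred on an arbitrary starting region.
const TrashMap = () => (
  <MapView
    style={StyleSheet.absoluteFillObject}
    initialRegion={{
      latitude: 38.7223,
      longitude: -9.1393,
      latitudeDelta: 0.1,
      longitudeDelta: 0.1,
    }}
  />
);

export default TrashMap;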

To get the photo's location, I had to track the user's current position, which is easier said than done given the available documentation. There is documentation on how to do it, but it isn't great, and it makes the process seem much more intuitive than it actually is.
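
For reference, the JavaScript side of it is just the geolocation API React Native exposes; what the documentation glosses over is the native setup (the permission entry in Info.plist and accuracy behaviour). A minimal sketch:

// Minimal sketch: grab the current coordinates before reporting a photo.
// The iOS permission prompt (NSLocationWhenInUseUsageDescription in Info.plist)
// still has to be configured separately on the native side.
navigator.geolocation.getCurrentPosition(
  ({ coords: { latitude, longitude } }) => {
    // attach latitude/longitude to the photo being reported
    console.log('Current position', latitude, longitude);
  },
  (error) => console.warn('Could not get location', error),
  { enableHighAccuracy: true, timeout: 20000, maximumAge: 1000 }
);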

After I got the map up and running, there were a few more issues I had to solve to get SnapTrash ready for testing:

  • C++ exception (not an Objective-C or Swift error, C++!): When I first tried to add markers with the bin bag icons to the map with react-native-maps, I ran into a few unwelcome crashes. The solution I found was to add the images to the view twice: the first time before the map view (where they end up hidden behind the map) and the second time on the marker components (see the sketch after this list).

  • Clustering: When it came to clustering images into groups, each represented by a single bin bag, the best solution was react-native-maps-super-cluster. I tried react-native-clustering and react-native-cluster first, and found them either much harder to install or poorly documented.

  • Overlay: The overlay enables the fullscreen view over the map, and that's where I hit yet another issue: I couldn't figure out how to stack two views with position: absolute on top of each other (the overlay over the map). I ended up checking a beta version of react-native-elements, where I found an overlay component that I could use as a base for the photo slider, which brings me to the next issue.

  • Photo slider: I had to change the overlay to remove its TouchableWithoutFeedback components, as they were capturing the touch events the slider needs in order to scroll. With the debugger attached the slider worked great, but with it off, the TouchableWithoutFeedback components became a problem. With react-native-carousel, I got the slider to do exactly what the designer had planned for the app.

  • 3D Touch: Touchables don't work well on 3D Touch screens when an inner view has position: absolute. The solution was to wrap the touchable in a view that positions the button properly.

Instead of:

<Touchable>
  <View style={{ position: 'absolute', ...}}>
    <Image .../>
  </View>
</Touchable> 

The solution is to move the position: absolute view outside the Touchable:

<View style={{ position: 'absolute', ...}}>
  <Touchable>
    <Image .../>
  </Touchable>
</View>

  • Styling: I tried to create the camera shutter button with two views: an outer view with opacity 0.5 and an inner view with opacity 1. However, the outer opacity overrides the inner one. In the end, I worked around it by exporting the button image from Sketch.
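
To make the first workaround on this list more concrete, here is a rough sketch of the double-render trick for the marker icons; the component name, photo shape and asset path are illustrative, not taken from the actual codebase.

import React from 'react';
import { Image, StyleSheet, View } from 'react-native';
import MapView, { Marker } from 'react-native-maps';

const binBag = require('./images/bin-bag.png'); // illustrative asset path

const PhotoMap = ({ photos }) => (
  <View style={StyleSheet.absoluteFillObject}>
    {/* First render: forces the icon to load; it ends up hidden behind the map. */}
    <Image source={binBag} style={{ width: 1, height: 1 }} />
    <MapView style={StyleSheet.absoluteFillObject}>
      {photos.map(photo => (
        <Marker key={photo.id} coordinate={photo.coordinate}>
          {/* Second render: the actual bin bag icon on the marker. */}
          <Image source={binBag} style={{ width: 32, height: 32 }} />
        </Marker>
      ))}
    </MapView>
  </View>
);

export default PhotoMap;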

Finding the best solutions to these issues was one of the best parts of developing SnapTrash, because it made clear why it was crucial to have everything running smoothly. The user experience shouldn't be ruined by small issues that pull the app away from its goals.

Final Remarks

SnapTrash was a great project to work on and it has a very important purpose, addressing one of the most serious problems we currently face at a global level. On the development side, it was exciting to work with a few new tools, as this was one of the first mobile apps I've built. These are the main takeaways I'm keeping from this project:

  • Be careful when using Strapi. Even though, in my experience, it was simple enough to use, there were still a few issues: it doesn't support authentication providers that require HTTPS callbacks, and the current SDK doesn't support React Native. In its defence, Strapi is still in alpha (I used alpha-14) and there's still plenty of room to improve.

  • Regarding React Native, you’ll have a bad time if you’ve never developed natively; given the gaps in the documentation, native experience is assumed from the start. You’ll hit errors coming from the native side and have no clue how to solve them right away.

  • Some packages, like react-native-maps, are a mess to add to your project, requiring you to meddle with native configuration files and add packages manually on the native side. Luckily, some of these pains are already addressed by the Ignite boilerplates.

  • The styling looks like CSS, except it isn’t. For instance, to stack two views with position: absolute on top of each other, I had to use an overlay component from react-native-elements. Performance also isn’t comparable to pure native; even accounting for that, navigation sometimes takes too long. For example, pressing the camera button quickly enough would push two camera screens onto the navigation stack.

All in all, those issues are far behind us now and SnapTrash will soon be released. The experience of developing it was fantastic, but I firmly believe the outcome will be even greater.

If you’re looking for help with mobile or web app development, our team is looking forward to meeting you! Drop us a line here!
