Inside Adyen: The JCenter Migration Saga
For as long as I can remember, JCenter has been the default repository for Android libraries.
Ever since the days when Android development was most commonly done in the Eclipse IDE, creating a new project would generate the whole project structure, including the Gradle files, and in those files a small statement that says `jcenter()`.
Over the years, Google decided to host their own Maven repository, so `google()` was added to that statement as well. But for simple projects, that was all you needed. Because of that, it became common for newly developed libraries to also be published to JCenter for convenience. It's a well-known open-source repository, it also mirrors Maven Central, and developers don't need to worry about adding a new repository to their project. Easy, right?
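For reference, the repositories block in a project-level build.gradle of that era typically looked like this (a generic sketch, not our exact file):

```groovy
// Typical top-level build.gradle of the time: every module resolves
// its dependencies from Google's repository and JCenter.
allprojects {
    repositories {
        google()
        jcenter()
    }
}
```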
Well, what a big shock when all of a sudden we got the news that JCenter would soon be gone! What about the thousands of libraries hosted there? How would I build my project now?
As developers, we all have tasks where we underestimate the amount of effort they're going to require, right? That's normal; estimations are pretty hard. We might not fully understand the requirements, we might find problems along the way, or we might simply not grasp the sheer scope of the task ahead of us. Well, for me, this task was a bit of all of the above.
And so the journey begins…
One of the beautiful things about being an Android developer is the community around it. I think the fact that the OS is Open Source has pushed the mentality of ownership to everyone in one way or another. Most people will never dig into the Framework code (although it is an interesting experience), but knowing that you can is already empowering. So no later than the next day, there were already great guides and blog posts on how to migrate your project to another widely used open-source repository, Maven Central.
The decision to move our public Android libraries (Components and 3DS2 SDK) from JCenter to Maven Central seemed like a no-brainer. But here at Adyen, we like to “include different people to sharpen our ideas”, so I started to ask around and check if this was indeed the best approach. One thing we also considered was to do what Google did and host our own repository; this would protect us from having to worry about an external service shutting down on us again.
First I reached out to our awesome Security team to explain the situation, see if they had any concerns, and talk through the pros and cons of each approach. There were a few questions about making sure that our release process was solid, and about what hosting our own repository would actually look like.
So I reached out to our Infrastructure team to get a feeling for the feasibility of this approach. We had some nice discussions and they were pretty enthusiastic about the idea as well. They presented a well-structured plan on how we could use a Nexus repository with good scalability and reliability in mind. But in the end, we all agreed that this was not really necessary and that leveraging the existing infrastructure of Maven Central was a good approach for now.
All of this would have been a huge help to anyone already, and it was pretty similar to the setup we already had for JCenter, so I felt pretty confident. But wait, there's more! Márton also explains how you can automate the whole release process.
This was simply perfect! We already use GitHub Actions to automate our CI process on Components, so I could simply follow along and apply any improvements I could find.
This seemed pretty straightforward. I opened my Sonatype account, used the same namespace we already had set up for our Java API library, and started working on the CI adjustments. Soon enough I opened a PR and felt like this was going to be a piece of cake, just a couple more details to go. But there was one small detail I forgot to take into consideration...
What about the old artifacts? The versions that had already been released.
Ah, this must surely be another common issue for Android developers: lots of people needed to migrate their libraries, so blog posts to the rescue!
That sounded pretty good and pretty straightforward. I did it once, and it worked! Brilliant! Now I just needed to repeat this process…
This is where I started to worry a little bit. See, our library is split into several modules to keep it modular, and each module is its own little library. Each one has several versions, and as it turns out, this scales up pretty quickly. The solution was to try and automate the process somehow.
Let me start by saying that I’m definitely not the best at scripting. My knowledge of Shell scripting is very basic at best, but this seemed like a good opportunity to learn by doing, so I started digging in.
The first step seemed to be downloading all of our existing artifacts from JCenter. The most common and powerful tool for downloading things from the internet seems to be curl, so with a single command I could do a lot.
First I noticed that if the URL pointed to a folder, the repository would return a simple HTML page with links to the nested folders and files. So the first step was to filter that output and extract the links from it.
Then I cleaned up the result a bit and used it to move on.
With this, I was able to write a small script that navigates the repository folders, and with a small assumption about what should be a file and what should be a folder, it either downloads the file or calls itself recursively to go one level deeper.
In the end, I had a local folder mirroring the repository structure, with all the files downloaded.
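The steps above can be sketched as follows; the base URL, the `downloads` output folder, and the link-extraction pipeline are my assumptions, not the original script:

```shell
#!/bin/bash
# Hypothetical reconstruction of the crawler; BASE_URL and the output
# folder are placeholder assumptions.
BASE_URL="https://jcenter.bintray.com/com/adyen/checkout/"

# Pull the href targets out of a folder-listing HTML page on stdin.
extract_links() {
  grep -o 'href="[^"]*"' | sed 's/href="//; s/"$//'
}

crawl() {
  local path="$1"
  local link
  for link in $(curl -s "$BASE_URL$path" | extract_links); do
    if [[ "$link" == */ ]]; then
      crawl "$path$link"            # trailing slash: assume folder, recurse
    else
      mkdir -p "downloads/$path"    # mirror the repository structure locally
      curl -s -o "downloads/$path$link" "$BASE_URL$path$link"
    fi
  done
}

# crawl ""   # kick off from the repository root (network access required)
```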
JCenter didn't require artifacts to be signed, but due to changes in our release process over time, some of ours were signed and some weren't. Since this was an opportunity to start from scratch, I decided to create a new key and use it to sign all the artifacts, both the old and the future ones.
So I asked our internal Security team to create a dedicated email address to associate with the new key, and then used it to sign all of the files I had downloaded. A similar approach to the previous script worked well: just navigate the folders and sign each file with GPG.
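That signing pass can be sketched like this; the `downloads` folder and the key's user id are assumptions:

```shell
# A minimal sketch of the batch-signing pass; the "downloads" folder and
# the key's user id are placeholder assumptions.
sign_all() {
  find downloads -type f ! -name '*.asc' | while read -r file; do
    # Create a detached, ASCII-armored signature (.asc) next to each artifact.
    gpg --batch --yes --armor --detach-sign \
        --local-user "android-sdk@example.com" "$file"
  done
}
```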
Now all I had to do was upload the artifacts and the signatures to Maven Central. I thought my recursive script approach would work well again. But how could I upload the files? curl would probably help me once more.
After some research, I found this post on the Sonatype support page that seemed like a good idea. With my profile ID and credentials at hand, I could make a POST call to OSSRH to create a staging repository. Then I could use that repository ID to upload the files. Something like this:
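A sketch of those two calls, with the profile ID, repository ID, and credentials as placeholder environment variables (the endpoints are from Sonatype's staging REST API):

```shell
# Hypothetical sketch; OSSRH_USER, OSSRH_PASS, PROFILE_ID, and REPO_ID
# are placeholders, not real values.
OSSRH="https://oss.sonatype.org/service/local/staging"

# Ask OSSRH to open a new staging repository for our profile; the XML
# response contains the stagedRepositoryId to upload against.
open_staging_repo() {
  curl -s -u "$OSSRH_USER:$OSSRH_PASS" \
    -H "Content-Type: application/xml" \
    -d "<promoteRequest><data><description>JCenter migration</description></data></promoteRequest>" \
    "$OSSRH/profiles/$PROFILE_ID/start"
}

# Upload a single file into that staging repository with an HTTP PUT.
upload_file() {
  local file="$1" remote_path="$2"
  curl -s -u "$OSSRH_USER:$OSSRH_PASS" \
    --upload-file "$file" \
    "$OSSRH/deployByRepositoryId/$REPO_ID/$remote_path"
}
```

The `stagedRepositoryId` returned by the first call is what goes into `REPO_ID` for all the uploads that follow.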
And this seemed to work fine in the beginning: I was able to open the repositories and upload the files. But when I tried to close a repository to publish it, OSSRH would give me errors: invalid signatures, missing metadata. I was pretty lost at this point.
I decided to reach out to the people who had been helping me without knowing it: the blog post writers! I was already following Márton on Twitter, so I sent him a message to see if he knew anything about this process. Unfortunately, he didn't, but after a few days he messaged me back with another article he had found. My hope was rekindled!
“To sign and upload the artifacts to Maven Central, what better tool to use than Maven itself?”
Of course! I hadn't even considered that. Unfortunately, I couldn't use his script directly, but it gave me a lot of insight into how I could approach the problem differently.
First I tried to upload the files with the existing signatures using `mvn deploy:deploy-file`, but for some reason that didn't work so well, so I deleted my signature files and decided to mimic his approach instead.
The little trick for me was to set the new GPG key as the default signing key, since I couldn't specify the key as a parameter to Maven. To do that, I created the GPG configuration file, located at `~/.gnupg/gpg.conf`, containing the line `default-key <your_key>`.
I also set up my credentials in the Maven configuration file like Jeroen mentions in his post and started running some tests. Luckily, he also mentions this Sonatype documentation a couple of times, and that's where I could see he had done a much better job than me at researching. It helped me figure out how to also add the sources and Javadoc files to my script.
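Each version's upload then boiled down to one call to the maven-gpg-plugin's `sign-and-deploy-file` goal. A sketch, where the file names and the `ossrh` server id (holding the credentials in `~/.m2/settings.xml`) are my assumptions:

```shell
# A sketch of one version's upload; the file naming scheme and the "ossrh"
# server id are placeholder assumptions.
deploy_version() {
  local name="$1"   # e.g. drop-in-3.6.5
  # Signs every file with the default GPG key and uploads artifact, POM,
  # sources, and Javadoc in a single invocation.
  mvn gpg:sign-and-deploy-file \
    -Durl=https://oss.sonatype.org/service/local/staging/deploy/maven2/ \
    -DrepositoryId=ossrh \
    -DpomFile="$name.pom" \
    -Dfile="$name.aar" \
    -Dsources="$name-sources.jar" \
    -Djavadoc="$name-javadoc.jar"
}
```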
Since we have several small libraries coming from different older release processes, there was no standard POM format I could rely on. So I created a template XML file with placeholders to replace the original POM files. Shoutout to Rodrigo Rocco here at Adyen, who helped me a LOT with writing this script.
The script reads the original POM file to extract a few tags and substitutes their values into the template file, which contains all the required fields.
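A sketch of that templating step, assuming hypothetical `{{...}}` placeholders and a `pom-template.xml` file (the real template and tag set may differ):

```shell
# Hypothetical sketch of the POM templating; the placeholder names and the
# pom-template.xml file are assumptions, not the real script.
generate_pom() {
  local original="$1" output="$2"

  # Grab a single tag's value from the original POM (first match wins).
  tag() { grep -o "<$1>[^<]*</$1>" "$original" | head -n 1 | sed 's/<[^>]*>//g'; }

  # Fill the template's placeholders with the values from the original POM.
  sed -e "s/{{GROUP_ID}}/$(tag groupId)/" \
      -e "s/{{ARTIFACT_ID}}/$(tag artifactId)/" \
      -e "s/{{VERSION}}/$(tag version)/" \
      pom-template.xml > "$output"
}
```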
So finally, with the correct POM files, the correct signatures, the right tool, and a lot of help from a lot of people, after running the upload script for approximately 8 hours and generating a result log file of 130,000 lines, all 1,134 versions were successfully uploaded to Maven Central.
It's been a wild ride, but I learned a lot in the process. As Android developers, this is not what we're used to dealing with every day, but it's exactly the kind of challenge that makes us learn and grow as professionals. I'm super happy that here at Adyen we get the ownership and the encouragement to face challenges like this head-on, and that we get to have fun while doing it.