[Devember2021] Creating a home for the Level One NFT

Morning all,
This is my first Devember and first personal coding project since I left university. I have been spurred into action by my recent purchase of the Level One NFT. I believe that for the common good there should be a showcase for the Level One NFT so people can come and admire it.

The long rambly bit about project plans

So that’s the MVP, a nice simple site. Then I got to thinking: NFTs are too commercial; the core reason people spend money on NFTs seems to be their historical significance. The stretch goals are therefore focused on creating a free kind of token, where verified individuals (via OAuth2 subject claims) can mint simple tokens that are verified server side instead of on a blockchain. Transfer of ownership can be done with single-use URLs. I will gatekeep verification personally to prevent fakes and randos getting verified.
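To make the single-use URL idea concrete, here's a minimal sketch of how the server side could work. Everything here is hypothetical (the class, the URL shape, the in-memory store); the real project would persist pending transfers in MongoDB, but the core trick is the same: the code is deleted on first redemption, so the link can never be replayed.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: the server mints a random transfer code, hands it
// out as a URL, and removes it on first use so the same link can never
// transfer the token twice.
public class TransferLinkStore
{
    // transfer code -> id of the token it moves
    private readonly Dictionary<string, int> _pending = new();

    public string CreateTransferUrl(int tokenId)
    {
        var code = Guid.NewGuid().ToString("N");
        _pending[code] = tokenId;
        return $"https://example.invalid/transfer/{code}";
    }

    // Returns the token id exactly once; a second call with the same
    // code fails because Remove already deleted the entry.
    public bool TryRedeem(string code, out int tokenId)
    {
        return _pending.Remove(code, out tokenId);
    }
}
```

A real version would also need the redeeming user's subject claim so ownership history can be recorded at redemption time.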
The next stretch goal is a Twitch integration so tokens can be awarded to chat. For such an integration I think it’d be important to have a limited-run mechanism so that, for example, 500 copies of the same image can be offered, each with a numbered watermark like 069/500.
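The watermark numbering itself is just zero-padding the copy number to the width of the run size. A tiny illustrative helper (the class and method names are mine, not from the project):

```csharp
using System;

// Sketch of limited-run watermark numbering: pad the copy number to the
// width of the run size, e.g. copy 69 of 500 becomes "069/500".
public static class Watermark
{
    public static string Label(int copy, int runSize)
    {
        if (copy < 1 || copy > runSize)
            throw new ArgumentOutOfRangeException(nameof(copy));
        var width = runSize.ToString().Length;
        return copy.ToString().PadLeft(width, '0') + "/" + runSize;
    }
}
```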
As a very far away stretch goal I do hope to add the capability to convert a token to a full-fledged NFT, where the individual only needs to pay the transaction fees imposed by the minting process. I’ll make sure I capture enough info at minting time so that any token can be added to the blockchain in the future.

Architecture

I am a .NET/C# developer through and through. As such the architecture will be overkill just so I can do as much as possible in C#. I’ll also do as much hosting as possible on Linode.


(Living doc is here where I will put wireframes and other such visual things)
We have a lot going on here so I’ll only talk in broad terms so this doesn’t turn into War and Peace.
On the left we have requests coming in. All requests go through Cloudflare for free SSL… there are other reasons but I’m lazy and this is the one I care about.
We then come into a web application firewall (WAF) of some kind to prevent traffic coming from anywhere but Cloudflare. I’ve never done a WAF before so the whole process will be a learning experience.
We then have three domains for various Blazor WebAssembly sites (static content) and .NET 6 APIs. These are all managed by the Linode Kubernetes (God help me) offering using Docker Hub containers. Docker and Kubernetes are both completely new to me, I am but a lowly scrublord developer, so this will be the biggest challenge. I still think it’s the best option because I don’t want the stress and headache of server maintenance; I’m happy to pay the premium for it to be someone else’s problem.
On the bottom we have an as-yet-unknown logging solution. I want App Insights but that’s Azure only, so I need to find an alternative. Then we have MongoDB, so I don’t have to maintain a schema, and a Linode bucket thing to put all the uploaded images in.
Along the top we have Auth0 handling all the difficult federated identity stuff. I won’t be having local accounts so I don’t need to touch a password. I’ll just store subject claims as my user IDs and cache names so I can display them on the token history.
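The "subject claim as user ID" idea is simple enough to sketch. Assuming token validation has already happened (Auth0 in this project), the stable identifier is just the `sub` claim on the principal; the helper below is illustrative, not the project's actual code. Note that ASP.NET Core's JWT handler remaps `sub` to `ClaimTypes.NameIdentifier` by default, hence the fallback.

```csharp
using System.Security.Claims;

// Sketch of "no local accounts": after the identity provider validates the
// token, the user id is just the "sub" claim. Display names would be cached
// separately so token history stays human-readable.
public static class UserIds
{
    public static string FromPrincipal(ClaimsPrincipal user) =>
        user.FindFirst("sub")?.Value
        ?? user.FindFirst(ClaimTypes.NameIdentifier)?.Value;
}
```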
The other item on the top is the twitch integration piece. I really don’t know how this will work so lots of question marks in that area.
Finally on the right we have all the NuGet packages I’ll be making to reduce code duplication. I’ll probably also make Refit client packages so the APIs will self-document, via interfaces, how to communicate with them.

The end bit

There is a lot here but I’m confident in delivering the MVP by the end of December. I have already got https://r5k.page/ hooked up with the Blazor starter template hosted in Kubernetes and I’ll aim to get the rest of the unknowns done by the end of the month. November should then be a straight run of what I know and that leaves December for figuring out Twitch.
Thank you for listening to my Ted talk, I will now accept questions and comments.

EDIT: Just trying to add the devember tag


Hi all,
I’m a bit late with this update. I honestly had a lot of failure over the past week and wanted to at least get something working before writing the update.

Winning would be too easy

My first failure was minor and happened when I tried to edit this post to add the devember2021 tag. For some reason the tag wasn’t added and the post went back to the top ¯\_(ツ)_/¯. So I guess this won’t have the tag.

Next we have the logging. The ultimate goal for this project was to go full Linode. The built-in logging service provided by Linode is “Longview”; I’m sure it’s very nice, but it’s limited to basic system metrics like CPU usage.
Then we have the marketplace, where I found Grafana and Prometheus. Being a developer pleb I was immediately confused by the intricacies of this solution. First I needed the networking set up to allow Prometheus access to all the instances of my application, and presumably a data bucket. I’d then separately need to set Grafana up to communicate with Prometheus.
There was a lot of back and forth as I just didn’t like my logging solution being aware of, and having access to, everything generating logs. I tried to swap Prometheus out for Tempo on some beta OpenTelemetry stuff, only to get half a solution and no documentation on how to set Tempo up.
For now I’ve admitted defeat and saved basically £10 per month by using Application Insights. I’ll revisit this in 2022, or if anyone has any suggestions I’m happy to spend some time looking at something else. Just bear in mind what any alternative is competing with, and that the goal here is to satisfy number 9 on the OWASP Top 10… without bankrupting me over a meme.

Progress

So, that’s enough failure for now. It was a rough blow to come crawling back to papa Microsoft, but work must continue; I have a completely overcomplicated website to build that literally dozens of people will visit. We can start with some pretty colour coding on the architecture diagram:


Immediately you can see lots of red things I’ve not started yet. The big thing I need to achieve next week is making everything needed for the MVP yellow. I just need public git repos and NuGet packages for them, along with a clear build/publish procedure.
There are also lots of green things, and green things are good. The big wins here are having two domains (token.simpletokens.app, r5k.page) on the same Kubernetes instance with domain-based routing, HTTPS between Cloudflare and Kubernetes, a WAF that limits access to each resource, and a MongoDB instance with proven working role-based authentication.

Juicy Deets

As a summary that’s good but this is Devember so I feel obliged to go into some nerdy detail.
We can start with Docker. Coming into this project I’d only used Azure app services before the way that works is you write your code and Azure puts it in a container for you and hosts it, making the service publicly available. Docker offers the first half of that and a little bit more. You get the zero maintenance and the added benefit of debugging your code locally on the same OS it’ll be hosted on. One of the big issues with developing in Windows and deploying to Linux is the file system difference meaning code works locally then fails when deployed. If you’re using visual studio then that mostly takes care of your docker requirements for you, with only a few small issues where visual studio seems to be unable to run docker in debug mode… for some reason.
Next is Kubernetes. Kubernetes has a reputation, with everyone I’ve talked to, of being overly complicated and difficult to use. From my time so far with it, I 100% agree. I don’t think I’ve encountered an application begging for an interpolation layer more than Kubernetes. Now that I have it set up everything is fine, but look at how traffic moves through Kubernetes:


Each box here is a pod with its own obtuse YAML file describing it (except the three deployment instances, which share one file). This is really an area where Azure taking a cut for handling it for me is nice, and I hope Linode will offer something similar in the future so I can just pick a Docker container and a subdomain for routing.
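To give a flavour of that YAML, here is a hedged sketch of the domain-based routing piece: a single Ingress that sends each domain to its own Service. The service names and ports are illustrative, not the real manifests.

```yaml
# Illustrative Ingress: host-based routing sends each domain to its own
# backend Service. Names and ports are placeholders.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: domain-routing
spec:
  rules:
    - host: r5k.page
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: r5k-ui
                port:
                  number: 80
    - host: token.simpletokens.app
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: token-ui
                port:
                  number: 80
```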

The last thing I wanted to cover is the architecture of Prometheus vs App Insights. There are two completely different approaches in play here. Prometheus offers this architecture diagram:

Whereas App Insights has this:


The key difference is that Prometheus goes to an exposed endpoint on the application to fetch the logs, whereas App Insights is sent logs by the applications. Prometheus does offer a “Pushgateway” which replicates the App Insights behaviour, but they advise against using it.
I don’t know if this is bias, but my gut reaction is that it’s much better to receive logs than fetch them. In a cloud-based world, instances are frequently created and destroyed, so it seems like an instance could be created, do some stuff, then be deleted before a logging mechanism has a chance to figure out it existed. You also create a kind of backdoor into all your applications, where the logging solution has access to every application, and securing it just adds complexity and dependencies to a design where you want services to be as decoupled as possible.
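The pull model is visible right in the Prometheus configuration: you tell Prometheus where to reach into your services. A minimal illustrative scrape config (the job and target names are made up for this example):

```yaml
# Illustrative Prometheus config showing the pull model: Prometheus reaches
# out to each target's /metrics endpoint rather than the apps sending data.
scrape_configs:
  - job_name: simpletokens-api   # placeholder job name
    metrics_path: /metrics
    static_configs:
      - targets:
          - token-api:80         # placeholder service address
```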
I suppose on the flip side the App Insights approach is more vulnerable to attack. In theory anyone with your instrumentation key can send you lots of trash data that you’d end up paying for. With a managed-instance alternative, that could instead overload your CPU.

EDIT: make the prometheus image have a background colour

Afternoon all,
Got a nice quick update for you. It boils down to “work continues”.
In a bit more detail, I have created all the repos I should need for the MVP.


Note that the logging package is red, but I think by using App Insights I won’t need a logging package; I can just use .NET’s built-in ILogger.
The reason I don’t have much of an update is that I’ve entered the thoroughly boring bit of any “from scratch” architecture project, where I have a bunch of foundational work to do to speed up later development. This week, for example, I created a wrapper for the Amazon S3 bucket package so I have an abstraction layer for a code-once approach (among the many other benefits of using wrappers).
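The shape of such a wrapper is worth showing because it's what makes the rest of the codebase testable and vendor-agnostic. This is a hypothetical sketch, not the project's actual interface: application code depends only on the interface, one package references the real S3 SDK, and an in-memory stand-in serves for tests.

```csharp
using System.Collections.Generic;

// Hypothetical shape of a bucket wrapper: app code depends on this
// interface; only one package would reference the Amazon S3 SDK.
public interface IImageStore
{
    void Upload(string key, byte[] content);
    byte[] Download(string key); // null when the key doesn't exist
}

// In-memory stand-in for tests; the real implementation would delegate
// each call to the S3 client.
public class InMemoryImageStore : IImageStore
{
    private readonly Dictionary<string, byte[]> _objects = new();

    public void Upload(string key, byte[] content) => _objects[key] = content;

    public byte[] Download(string key) =>
        _objects.TryGetValue(key, out var content) ? content : null;
}
```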
This next week I’ll be working on a similar wrapper for the Mongo NuGet packages, then the core for all the Blazor websites.
I’m being very cavalier about my approach to unit testing and documentation. I will probably tackle those near the end of the whole project. I’ve never been one for test-driven development, but I do like quality gates on my main branches, even down to whitespace issues. In this case, though, I think more features is more shiny is more better.
For anyone who wants to follow the development here are the links to all the repos:

NuGet:

API:

UI:

Work Board:

As one final note, I’m on a work trip for most of next week, so the chances are I won’t get much done. Rest assured that if I have another nothing-burger update then I’ll just skip next week.

Hi all,
It’s been a while since I posted an update.
Progress has been slow. Unsurprisingly, as Christmas has approached and things have opened up again, my personal coding projects have taken a back seat to being outside with nature and family.
Regardless of my excuses I have hit a bit of a landmark. The API endpoints I need for the MVP have all been finished :tada:.
This blockchain stuff is quite complicated so I’ll go into the how at the bottom of the post.

As for shiny things the site currently looks like this:


This is the ugliest mess I’ve ever presented to a group of people, but I promise this is like 60% done. Right now it still has all the Blazor template stuff in so the site isn’t empty. It’s also got the default MudBlazor (my chosen component framework) layout/styling.
The good bits are that I have a translation system, a core UI library so I can implement a micro-UI framework, and Refit integration for type-safe web requests. My remaining work is to take the model I’m displaying on the screen and present it in a clean, themed way.
As for theming, I plan on running a dual-theme system. A few years ago a CSS media query was introduced: “prefers-color-scheme”. I plan on using it to alternate between a light theme and a dark theme based on the browser preference.
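The mechanism is pleasantly small. A sketch of the idea, with placeholder colours rather than the real palette: default to the light values and override the custom properties when the browser asks for a dark scheme.

```css
/* Dual-theme sketch: light palette by default, overridden when the
   browser requests a dark scheme. Colour values are placeholders. */
:root {
  --background: #ffffff;
  --text: #1a1a1a;
}

@media (prefers-color-scheme: dark) {
  :root {
    --background: #1a1a1a;
    --text: #f0f0f0;
  }
}
```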
This is my starter colour palette. Just like the software development I’m sure I’ll iterate on this.

I plan on keeping the majority of the styling stock from the MudBlazor library.

Now that the update is done, I can get into the main problem I encountered with NFTs. This is going to be very rambly and not to the point…
The goal of this project isn’t to build a simple page that displays an image with HTML. The goal is to have generic wrapper APIs around the Ethereum blockchain so you can fetch and create NFTs. When I created this goal I foolishly assumed that I owned something that I didn’t.
I think it’s common knowledge that an image is just some bytes of data that can be interpreted as an image. Another piece of common knowledge is that a blockchain is a large number of computers networked together that track ownership of data. The basic assumption for NFTs would be that each image is stored on some part of the blockchain network. Looking through the store listing on Mintable, you’ll learn that this isn’t the case:


That purple box on the right shows what an NFT really is. It’s JSON, a human-readable way of storing information. This is good because it means an NFT isn’t just an image without context; instead you get lots of nice information, like an image description. It’s a bit bad because the image isn’t stored on the blockchain. It’s like any other image on the web, and can be deleted if Mintable decides not to pay their bills… or just deletes it for some reason… or if cloudfront.net goes out of business, changes their domain, etc.
We’ve now reached my original assumption. I assumed that this JSON file is stored on the blockchain somehow, so that it will exist forever. I also assumed that the format of the JSON followed some kind of generic standard, so that NFTs created on the Mintable store are the same as NFTs created on OpenSea or some other marketplace. Unfortunately both my assumptions were wrong. The JSON for an NFT is just stored on a Mintable server with a Mintable-set standard (Here it is if you were curious)
From a development point of view, not having a fixed standard for NFTs makes it basically impossible (I’m aware I could do something horrible to solve this) to build a generic system for displaying NFTs. Each store will require special code to parse the token information.
From an ownership point of view you may be confused, because so far the blockchain hasn’t been involved in this storage process at all… But I definitely own the NFT; I can see it in my Ethereum wallet. To cut to the chase, what I own is the number 2738 in the very specific context of blockchain address 0x9201a886740d193e315f1f1b2b193321d6701d07. Those two things are basically the only bits of information stored on the Ethereum blockchain; and I plan on getting a 2738 tattoo in honour of this revelation.
If I see any demand for it then I’ll do a post explaining how you get from the blockchain to the mintable hosted NFT metadata. This post has gone on long enough and I think I’ve done enough to convey some of the difficulties of working with NFTs. Hopefully my next update will be with a completed MVP.

Hi all,
Final update for Devember this year. I’m wrapping up to see family and enjoy some games.
I finished up with a completed site: https://r5k.page/
I also made two microservice APIs each with an endpoint and various library packages.


Anyone with an appreciation for C# will probably enjoy reading through the various repos for this project on my GitHub: mparker42 / Repositories · GitHub
There are some cool patterns for API to API communication and development principles in play that allow development to scale without monolith overhead.

I’m happy with what I made and a bit disappointed in myself for not hitting any stretch goals around users creating their own tokens. I’ve honestly lost a lot of motivation after learning how NFTs work and a little bit from feeling like I’m posting into the void.
Next year I’ll focus on a less ambitious project so I can play a more active part in the community around the event.
