# Tokenizer

I had the idea after seeing an IRC channel topic asking people to use
a pastebin, with an example that used netcat:
`command | nc somepastebin.com 9999`

## Flexibility
Initially I thought about just copying the pastebin functionality,
but then I realised it could easily be used for URL shortening too.

The basic premise is to pass some data to the server in return for a
token, which can later be exchanged for the stored data.

There are a number of options to extend functionality:
- Decide how long the data is retained
- Count how many times the token is accessed
- Limit access to a token (disable/remove it after X clicks); a possible record layout is sketched below
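
A per-token record covering those options might look something like
this (a sketch only; the field names are mine, not necessarily what
the module uses):

```perl
# Hypothetical per-token record; all field names are illustrative.
my %record = (
    data     => "This is the cat's meow!",
    created  => time(),   # creation timestamp, for retention decisions
    ttl      => 86400,    # how long to retain the data, in seconds
    hits     => 0,        # incremented on every access
    max_hits => 5,        # disable/remove the token after this many accesses
);
```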

## Implementation
I wrote a Perl module, Token.pm, which handles token creation and the
storing and retrieving of data.
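
As a rough illustration, a minimal Token.pm along these lines might
look like the sketch below. The function names, storage directory,
and 6-character token format (matching the examples further down) are
all my assumptions, not the actual module:

```perl
package Token;
use strict;
use warnings;

my $store = '/var/lib/tokenizer';          # assumed storage directory
my @chars = ('a'..'z', 'A'..'Z', 0..9);

# Create a random 6-character token for $data, retrying on collision.
sub create {
    my ($data) = @_;
    my $token;
    do {
        $token = join '', map { $chars[int rand @chars] } 1 .. 6;
    } while (-e "$store/$token");
    open my $fh, '>', "$store/$token" or die "store: $!";
    print $fh $data;
    close $fh;
    return $token;
}

# Exchange a token for its stored data; undef if the token is unknown.
sub retrieve {
    my ($token) = @_;
    return undef unless $token =~ /^[A-Za-z0-9]{6}$/;   # no path tricks
    open my $fh, '<', "$store/$token" or return undef;
    my $data = do { local $/; <$fh> };
    close $fh;
    return $data;
}

1;
```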

I then created a basic server that listens for data on a specific
port and replies with a URL containing the token.
If the data begins with http:// or https://, it returns a URL that
redirects to that address. Otherwise it returns a URL for viewing the
data as plain text, either with curl or in a browser:
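
A rough sketch of that listener, reusing the hypothetical Token.pm
interface above (port 2020 and the /V/ and /R/ paths come from the
examples below):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;
use Token;   # the sketch above; assumed to be on @INC

my $base = 'https://MY-IP-ADDRESS';   # ideally a very short domain
my $srv  = IO::Socket::INET->new(
    LocalPort => 2020,
    Proto     => 'tcp',
    Listen    => 5,
    Reuse     => 1,
) or die "listen: $!";

while (my $client = $srv->accept) {
    my $data = do { local $/; <$client> };   # read until the sender closes
    unless (defined $data && length $data) { close $client; next }
    $data =~ s/\s+\z//;                      # strip the trailing newline
    my $token = Token::create($data);
    # http(s) input gets a redirect link (/R/), anything else a view link (/V/)
    my $kind = $data =~ m{^https?://} ? 'R' : 'V';
    print $client "$base/$kind/$token\n";
    close $client;
}
```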

Examples:
`# echo "This is the cat's meow!" | nc MY-IP-ADDRESS 2020`
`https://MY-IP-ADDRESS/V/yjOLc2`

`# echo "http://google.com" | nc MY-IP-ADDRESS 2020`
`https://MY-IP-ADDRESS/R/ztbvx2`

The final part was to create a CGI script that returns the data or
performs a redirect when the URL is viewed in a browser.
Ideally this should be hosted on a very short domain name.

`# curl -s http://MY-IP-ADDRESS/V/yjOLc2`
`This is the cat's meow!`
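
A minimal sketch of that CGI handler, keyed off PATH_INFO and again
leaning on the hypothetical Token.pm interface from above:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Token;   # the sketch above; assumed to be on @INC

# Expect PATH_INFO of the form /V/yjOLc2 (view) or /R/ztbvx2 (redirect).
my ($kind, $token) = ($ENV{PATH_INFO} // '') =~ m{^/([VR])/(\w+)$};
my $data = defined $token ? Token::retrieve($token) : undef;

if (!defined $data) {
    print "Status: 404 Not Found\nContent-Type: text/plain\n\nUnknown token\n";
}
elsif ($kind eq 'R') {
    print "Status: 302 Found\nLocation: $data\n\n";    # redirect to the stored URL
}
else {
    print "Content-Type: text/plain\n\n$data";         # plain-text view
}
```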

## Status
At the moment it works locally, providing URLs for viewing and
redirecting. It also stores the token creation date and counts the
number of times each token has been accessed.

UPDATE 11-12-2020:
I have extended the functionality to include viewing over Gopher and
keeping a count of how many times a token/link is accessed. I have
also shortened the https:// URLs as much as I can, but ideally I need
a short, cheap domain name.
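
A raw Gopher request is just a selector followed by CRLF on port 70,
so the Gopher view can also be tested with netcat (assuming the
selectors mirror the HTTP paths, which is a guess on my part):

`printf '/V/yjOLc2\r\n' | nc MY-IP-ADDRESS 70  # selector format is an assumption`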