#|
I haven't done deep learning before, though I know a little about it. A research
project of Twitter's seemed to have attractive properties, and I thought I would
wade in and take it on.

Actually, I started out exploring the creation of a system that would extract
the intent from the CPU Torch Python code.

However, the 500 lines of #p"src/run_GNN.py" boil down to
|#
(defun train ()) ; elided for now, but is instructive

(defun deep-learn (trainer model optimizer data test-acc
                   &key (number-epochs 1500))
  "Run TRAINER for NUMBER-EPOCHS epochs, printing a timestamped
loss and accuracy triple after each epoch."
  (loop for epoch below number-epochs
        for this-time = (get-internal-real-time)
        for loss = (funcall trainer model optimizer data)
        do (multiple-value-bind (train-acc val-acc test-acc)
               (funcall test-acc model data)
             (print `(,this-time ,loss ,train-acc ,val-acc ,test-acc)))))

(let* ((data (get-dataset path))
       (model (gnn command-line-options data))
       (optimizer (get-optimizer command-line-options (parameters model))))
  (deep-learn #'train model optimizer data #'test-acc))
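
;; DEEP-LEARN expects its accuracy function to return three values via
;; VALUES, which MULTIPLE-VALUE-BIND then destructures. A hypothetical
;; placeholder sketching that protocol (the name matches the sketch
;; above, but the body and its constants are mine, not run_GNN.py's):
(defun test-acc (model data)
  "Evaluate MODEL on DATA; return (values train-acc val-acc test-acc)."
  (declare (ignore model data))
  (values 0.0 0.0 0.0))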

#| And a list of default parameters to pass to an optimizer-getter.

Since the file was about trying different command-line arguments on a Python
script, and what I wanted was a single base case, the file isn't otherwise
interesting; it would be nice, though, to scoop out some default values.
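|#
;; A sketch of such scooped-out defaults, as a plist handed to the
;; optimizer-getter. The keys and values here are hypothetical
;; placeholders, not read off run_GNN.py's actual argparse defaults:
(defparameter *default-optimizer-options*
  '(:optimizer :adam :learning-rate 1.0e-2 :weight-decay 5.0e-4)
  "Base-case options to pass to GET-OPTIMIZER.")
#|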

I don't think the software engineering of this file, nor of the Python, was
insightful. Well, let's see what I can get for 'gnn, 'get-optimizer, 'train,
and 'test-acc !
|#