I am roaming on the Orange network here in Barcelona using a three.co.uk SIM card. It's working quite well: 4.18 Mbps down and 2.85 Mbps up. I am testing from inside a hotel, using an iPad Air (2019) with a BlackBerry Passport as a mobile hotspot.
Tsubasa Kato here. I just came back from Starbucks, where I was doing some study on information retrieval and related topics.
I was checking out things like the Browse project, the Tapestry project by Xerox PARC, SIFT (Stanford Information Filtering Tool), and so on. I then headed to Bing and looked up "memex research notes", and to my surprise, it turned up a tool called WorldBrain's Memex.
It's a browser extension, and it also has a Dashboard. I just started using it, and it's a little like Evernote and Pocket, except that it stores your data locally until you choose to upload it to their server yourself.
This extension is a good source of inspiration for my Thought Remix project, which is progressing slowly.
So, the other day I finished taking a MOOC course. I am now working on part two of the course, and will try to finish it soon.
I also bought a tiny drone called the G FORCE SQUARED, which has a protective net so the blades don't get damaged when I accidentally crash it. I flew it outside yesterday, but the wind was too strong for my drone; it does say it's recommended for indoor use. (The drone operates via radio, not IR, by the way.)
I also cleared out a lot of books that I had gathered over the years, and the local used book store bought them at a fair price. Books take up too much space! lol
I'm going out to work on something, and then I'll send off an order I received on eBay.
I was playing around with a Pocket PC application on my EMONSTER S11HT. The EMONSTER S11HT is a variant of the HTC TyTN II. I have two EMONSTER S11HTs, one of which is a backup (slightly broken, currently kept for parts).
As you can see, HYP, SIN, COS, TAN, LOG, LN, RND, ASIN, ACOS, ATAN, Pi, etc. are supported.
I downloaded the ARM Pocket PC version of the application, and it worked without any hassle.
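For a quick reference, those function names map onto the standard scientific calculator set. Here's a minimal sketch of the same operations using Python's math and random modules (Python stands in purely for illustration; the Pocket PC app itself is a GUI calculator, not scriptable):

```python
import math
import random

x = 0.5

print(math.sin(x), math.cos(x), math.tan(x))     # SIN, COS, TAN
print(math.asin(x), math.acos(x), math.atan(x))  # ASIN, ACOS, ATAN
print(math.sinh(x), math.cosh(x), math.tanh(x))  # HYP typically toggles to this hyperbolic family
print(math.log10(x), math.log(x))                # LOG (base 10), LN (natural log)
print(random.random())                           # RND: pseudo-random number in [0, 1)
print(math.pi)                                   # Pi
```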
This is getting interesting, or perhaps a little scary.
"We’ve trained a large-scale unsupervised language model which generates
coherent paragraphs of text, achieves state-of-the-art performance on
many language modeling benchmarks, and performs rudimentary reading
comprehension, machine translation, question answering, and
summarization — all without task-specific training." ...
"GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset
of 8 million web pages. GPT-2 is trained with a simple objective:
predict the next word, given all of the previous words within some text.
The diversity of the dataset causes this simple goal to contain
naturally occurring demonstrations of many tasks across diverse domains.
GPT-2 is a direct scale-up of GPT, with more than 10X the parameters
and trained on more than 10X the amount of data."
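To make that "predict the next word, given all of the previous words" objective concrete, here's a minimal sketch in Python. It substitutes a toy bigram count model for the transformer (the corpus, names, and add-one smoothing are my own illustration, not anything from the GPT-2 release), but the quantity being minimized is the same idea: the average negative log-probability of each actual next word.

```python
import math
from collections import Counter, defaultdict

# Toy stand-in for the language modeling objective quoted above.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count bigrams to estimate P(next word | previous word).
vocab = sorted(set(corpus))
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word_prob(prev, nxt):
    # Add-one smoothing so unseen continuations get nonzero probability.
    counts = bigrams[prev]
    return (counts[nxt] + 1) / (sum(counts.values()) + len(vocab))

# The training loss is the average negative log-probability of the
# actual next word at every position (cross-entropy).
nll = -sum(math.log(next_word_prob(p, n))
           for p, n in zip(corpus, corpus[1:])) / (len(corpus) - 1)
print(f"average next-word NLL: {nll:.3f}")
```

GPT-2 conditions on all previous words through self-attention rather than just the single previous word, and predicts over subword tokens rather than whole words, but the objective it optimizes is this same next-token cross-entropy.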