Almost all my computer science students are using Typst on my recommendation to write up their programming projects, versus most using Microsoft Word last year. Specifically, they're writing in VSCode with the Tinymist Typst extension. All going very well so far and no complaints.
I was with you when you explained how you got them off MS Word (even worse if it was the web version!!), until you brought up VSCode. I get it, you are probably advocating very practical choices. It's just that I wouldn't recommend students become dependent on VSCode (due to its vendor and its "telemetry"): that makes the students dependent on MS again. But I guess since you recommend an extension, as someone in a teaching position you can't recommend VSCodium plus adding the extension store? If you can, then I would strongly advocate recommending VSCodium instead (or non-Electron editors), to avoid spyware on their machines and to promote healthier tooling.
I have not tried this one, but the predecessor (typst-lsp) was working fine with Neovim for me when I did my little Typst experiment last year. Whatever the VSCode plugin is offering should be possible to replicate on top of tinymist.
I'm writing this on a Unihertz Jelly Star which is tiny, and I consider it my "protest phone" at the lack of decent small phones.
A friend jokingly calls it my "microphone", another a "prison phone" (its size makes it easier to smuggle in body cavities). Occasionally I go into mobile phone shops and ask if they have a case for it, just for the fun of seeing the look on their faces when they see it (I don't actually want a case; in fact it came with one, which I threw in the bin).
Personally, I couldn't be happier with it.
Only problems: they don't do software updates; camera is poor; non-OLED screen.
In an ideal world I'd have a slightly bigger phone, but not too much bigger. I've grown very fond of this phone.
The lack of updates and general software sketchiness is what has me turned off from the Jelly. I know a product like that never has a chance in hell of running Graphene; I'd be way more interested if it could run Lineage.
Is OLED burn-in really something people still care about? I have a handful of OLED devices, some of which I've used daily for nearly 10 years, and none of them have any burn-in. I've never even seen burn-in on anything other than a signage TV, and that happens even on some LCDs.
> I've never even seen burn-in on anything other than a signage TV, and that happens even on some LCDs
AFAIK, the hardware still suffers from that problem, but it's been mitigated in most devices by software. Instead of displaying exactly the same content 24/7, they have "cleaning programs" or similar that run once in a while to prevent burn-in from happening. I think our OLED TV does the same.
Of course, and many devices also use various pixel shift techniques. My point is that this isn't really a drawback from the user's perspective. Saying "I consider non-OLED to be a selling point because it won't burn in" simply doesn't make sense anymore.
I know someone who spends so much time with YouTube on their phone that the logo is visibly burned in to the screen. The phone is less than 2 years old.
Samsung smartphones are everywhere here and I've never seen a burn-in screen. Is there a difference between OLED and AMOLED? I thought that AMOLED was just a "flavor" of OLED.
The Jelly Star's battery life is surprisingly decent for its size - I get about 8 hours of moderate use, but it requires a mid-day charge if you're using GPS or watching videos.
The Jelly Max is 5" (so bigger than previous Jellies, smaller than mainstream phones). I'd strongly consider one except for their lack of software updates.
Yeah, not ideal… then again… Samsung actually pushed an update with an overlay that shows ads on some old models. I had one I used to show the forecast, attached to a wall. I had to reset it to factory settings and forbid all updates, or ads would appear on top of my app.
I loved tixy when I first discovered it a few years ago so created this https://www.mathsuniverse.com/tixy (with permission from the original author) with puzzles to solve on the tixy grid. I use it with my computer science students who get really into it.
I was blown away by the little functions at first and I too made a clone to experiment with calculang [1].
I added an evaluation feature (F9) so you can select sub-expressions and see what they do, which was helpful to figure out some patterns (video in [2])
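For anyone who hasn't tried tixy: each cell of a 16x16 grid gets a value from a function of time, index, and position. A minimal sketch of that evaluation loop (names here are illustrative, not the actual tixy source):

```javascript
// Sketch of a tixy-style evaluation loop. Each of the 16x16 cells gets a
// value in [-1, 1]: magnitude sets the dot size, sign sets the colour.
const SIZE = 16;

function evaluateGrid(f, t) {
  const cells = [];
  for (let i = 0; i < SIZE * SIZE; i++) {
    const x = i % SIZE;          // column
    const y = Math.floor(i / SIZE); // row
    // Clamp to [-1, 1], like the original playground does
    const v = Math.max(-1, Math.min(1, f(t, i, x, y)));
    cells.push(v);
  }
  return cells;
}

// Example pattern: a diagonal wave
const wave = (t, i, x, y) => Math.sin(t + (x + y) / 4);
const frame = evaluateGrid(wave, 0);
```

Rendering is then just drawing one circle per cell, with radius proportional to `Math.abs(v)`.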
What's wrong with that statement? It has historically and traditionally been true for raster displays, even if there do exist ways to use standard Cartesian-style coordinates with a computer.
The top left has usually been (0, 0) for hardware pixel coordinates (although even then there are plenty of exceptions, e.g. mode 13h scrolling), but as a blanket statement about computer graphics in general it's misleading.
I'm struggling to see the problem with this statement, other than maybe to add in the word "usually". My students will know of graphs in maths where the origin is always bottom left. When working with HTML canvas and every other computer graphics situation I've worked in, it's top left instead.
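Going between the two conventions is a one-line flip, which is what I end up showing students when they first hit the canvas. A minimal sketch (the function name is mine, not a canvas API):

```javascript
// Convert a point from maths-style coordinates (origin bottom-left,
// y increasing upwards) to canvas-style coordinates (origin top-left,
// y increasing downwards). The x-axis is unchanged; only y is flipped.
function toCanvas(x, y, canvasHeight) {
  return { x: x, y: canvasHeight - y };
}

// On a 100px-tall canvas, the maths origin (0, 0) lands at the bottom left:
const p = toCanvas(0, 0, 100); // { x: 0, y: 100 }
```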
"PostScript uses a coordinate system where the origin is at the bottom-left corner of the page, with the x-axis increasing to the right and the y-axis increasing upwards."
Oscilloscopes use middle-left.
Unreal engine and SketchUp use Screen middle with xy increasing to the right.
In AutoCAD, the user coordinate system puts the origin a third of the screen from the left, with X increasing to the right and Y increasing upwards.
Almost all raster displays and memory-based programs assume top left, because that is how it was done first, which is counter-intuitive.
It is not counter-intuitive, and the decision extends far earlier than the first displays.
A raster image onscreen is displayed in the order that the data appears when written down. It stands to reason that a data depiction should be in the same orientation as the display orientation. Displays were created by people who read from left to right, top to bottom. If displays did not follow that order, images would be flipped or rotated when displayed in data form.
The first pixel written to the display is in the top left because we read from the top left. If writers of another language had popularised the technology, perhaps things might have been different.
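The "data order matches reading order" point is just row-major indexing: a framebuffer is one flat array, row after row, so the first element in memory is the top-left pixel. A sketch of the usual mapping:

```javascript
// Row-major indexing: the framebuffer is stored as one flat array,
// row after row, so memory order matches left-to-right, top-to-bottom
// reading order. (0, 0) is the top-left pixel.
function pixelIndex(x, y, width) {
  return y * width + x;
}

const WIDTH = 320;
pixelIndex(0, 0, WIDTH);   // 0   -> top-left pixel comes first in memory
pixelIndex(319, 0, WIDTH); // 319 -> end of the first row
pixelIndex(0, 1, WIDTH);   // 320 -> start of the second row
```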
A guerilla marketing plan for a new language is to call it a common one-syllable word, so that it appears much more prominent than it really is in badly-done popularity contests.
Call it "Go", for example.
(Necessary disclaimer for the irony-impaired: this is a joke and an attempt at being witty.)
Amusingly, the chart shows Rust's popularity starting from before its release. The rust hype crowd is so exuberant, they began before the language even existed!
I'm not so sure. While Java has never looked better to me, it does "feel" like it's in significant decline in terms of what people are asking for on LinkedIn.
I'd imagine these days TypeScript or Node might be taking over some of what would have hit on JavaScript.
Recruiting Java developers is easy mode; there are rather large consultancies and similar suppliers that will sell or rent them to you in bulk, so you don't need to nag with adverts to the same extent as with Pythonistas, Rubyists, and TypeScript developers.
But there is likely some decline for Java. I'd bet Elixir and Erlang have been nibbling away on the JVM space for quite some time, they make it pretty comfortable to build the kind of systems you'd otherwise use a JVM-JMS-Wildfly/JBoss rig for. Oracle doesn't help, they take zero issue with being widely perceived as nasty and it takes a bit of courage and knowledge to manage to avoid getting a call from them at your inconvenience.
Speaking as someone who ended up in the corporate Java world somewhat accidentally (wasn't deep in the ecosystem before): even the most invested Java shops seem wary of Oracle's influence now. Questioning Oracle tech, if not outright planning an exit strategy, feels like the default stance.
Most such places probably have some trauma related to Oracle now. Someone spun up the wrong JVM by accident and within hours salespeople were on the phone with some middle manager about how they would like to pay for it, that kind of thing. Or just the issue of injecting their surveillance trojans everywhere and knowing they're there, that's pretty off-putting in itself.
Which is a pity, once you learn to submit to and tolerate Maven it's generally a very productive and for the most part convenient language and 'ecosystem'. It's like Debian, even if you fuck up badly there is likely a documented way to fix it. And there are good libraries for pretty much anything one could want to do.
a) Does your query for 'JS' return instances of 'JSON'?
b) The ultimate hard search topic is 'R' / 'R language'. Check whether you think you index it correctly. Or related terms like RStudio, Posit, [R]Shiny, tidyverse, data.table, Hadleyverse...
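The problem with a single-letter name is that a naive substring search matches almost everything, so you have to anchor on whole words plus ecosystem terms. A sketch of what I mean (illustrative patterns, not anyone's actual indexer):

```javascript
// A bare substring search for "r" matches nearly any English text, so
// match "R" only as a standalone word, plus related ecosystem terms.
const rPattern = /\bR\b|\bRStudio\b|\btidyverse\b|\bdata\.table\b/;

rPattern.test("I analysed it in R and RStudio"); // true
rPattern.test("Rust and Ruby are unrelated");    // false: no standalone "R"
```

The `\b` word boundaries are what stop "Rust" and "Ruby" from counting as hits for "R".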
As a teacher of computer science, I'm working on various free resources for teachers and students, such as a guide to simulating forces and collisions in JavaScript (https://www.mathsuniverse.com/particles) and super simple real-time forms for lessons (https://www.mathsuniverse.com/forms). What brings me joy is seeing that they are actually useful for people other than just me.