Hacker News | muunbo's comments

Hi hackers, my friends recently asked me to make a tutorial showing how to set up their phone with adblockers and dark mode. So I did! Hope it helps you browse the web more calmly. Below I copied my YouTube blurb so you know what the video is about (pls don't mind the overly compelling/engaging copy).

Take control of your iPhone and stop letting websites overload your brain. In this short video, I show you 3 free iPhone hacks that instantly make web browsing calmer, cleaner, and less distracting.

You’ll learn:
• How to install a powerful Safari ad blocker
• How to watch YouTube ad-free
• How to force dark mode on every website to protect your sleep and avoid melatonin disruption at night


Hi hackers, I made a tutorial and video on 2 different ways (GUI and CLI) of installing MicroPython on an ESP32. Hope it's helpful to those of you who want to try out hardware/embedded projects while leveraging your Python skills (and not having to learn Arduino or C/C++). Feel free to ask me any questions/clarifications here if you'd like :)
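For anyone curious what the CLI route looks like before watching, here's a minimal sketch using esptool, Espressif's standard flashing tool. The PORT and firmware filename are assumptions; adjust them for your board and the .bin you download from micropython.org.

```shell
# Minimal CLI install sketch for MicroPython on an ESP32.
# PORT and FW are assumptions - change them for your setup.
PORT="${PORT:-/dev/ttyUSB0}"
FW="${FW:-esp32-micropython.bin}"   # firmware .bin from micropython.org

if [ -e "$PORT" ]; then
    pip install --quiet esptool
    # Wipe whatever firmware is currently on the chip, then write MicroPython.
    esptool.py --chip esp32 --port "$PORT" erase_flash
    esptool.py --chip esp32 --port "$PORT" write_flash -z 0x1000 "$FW"
else
    echo "No device at $PORT - plug in the ESP32, set PORT, and re-run."
fi
```

After flashing, you can connect to the MicroPython REPL over the same serial port (e.g. with `screen` or `mpremote`).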


Whilst I am sure I probably understand it, I am not clear what you mean by "learning Arduino".


One important aspect missing is tactical feedback. The mouse, keyboard, and touchscreen all provide that but Vision Pro doesn’t. We have evolved several nervous system sensors (touch, pressure, temperature) for this and it is just as “human” as spatial reasoning and hand manipulation of objects


That combined with noisy signals from inferred controls like eye and hand tracking will frustrate a lot of people. Our current interfaces are extremely precise in comparison.

I quickly narrowed in on the VR experiences that provide a decent facsimile of real life as the most interesting and engaging. Beat Saber is a case in point, but for me it was combat flight sims using stick, throttle, and pedals. Moving to gestures and eye tracking wrecks all of that.

I also don’t know why the original article calls using a mouse indirect, as if we aren’t masters of that as well. Besides being a spatially aware species, we are also a tool-using species. We are experts at leveraging our spatial reasoning to manipulate tools in the world to do things we otherwise couldn’t. A mouse movement in the world translating to moving a cursor on screen is a perfect fit.


There is no tactile UI feedback in VPro, but then there is none with the mouse/keyboard either. And when you touch your fingers against each other for some gesture, you do feel them.


Physical objects being physical things that you can touch is the feedback: mass, inertia, and friction.


But the UI isn't physical, hence my point - there is no feedback from dragging a window; your hand doesn't get heavier because you're now holding a window with mass/inertia/friction.

You have those when moving a mouse while holding a button, but that's not related to the UI - you could just as well use it to do nothing and the feedback would be the same. NOT from the UI.

Otherwise your fingers have the same properties, so any gesture involving them is feedback just like the kind you get when you touch a non-finger object like a mouse.


You can feel that you've clicked a button (mouse, keyboard or whatever), but you can't feel that the camera has registered your pinch gesture.


There is no difference. You can't feel the part of the computer system that registers those clicks, same as with the camera. Maybe your mouse driver registered a double click due to a faulty sensor, maybe it registered nothing for the same reason, maybe your bluetooth mouse is disconnected or out of juice, so nothing registers.

All of this is exactly the same - you have no tactile UI feedback


Typing on a glass screen is different than typing on an actual keyboard.


Yet it's all the same category, which also exists in VPro, so there is no difference.


The mouse button, like most buttons, is "clicky" specifically for tactile feedback.


you've skipped a word there after tactile, which explains why you don't get it


The likelihood of a mouse button's tactile feedback not matching what the computer actually sensed is much lower than the likelihood of a camera not picking up your pinch-touch gesture.


There is tactical feedback on the Vision Pro for selecting, scrolling and moving.

I suspect that we are going to see most apps centre around this finger/thumb motion.


Do you and op mean tactile?


Yes whoops tactile


I totally agree with the “undemanding Canadians” view…so true! I moved here 12 years ago from Dubai and the demands/standards of business and service are way lower here. Pros are that it’s laid back and chill, cons are that we are unlikely to lead and innovate because we are satisfied with the status quo


[flagged]


Whoa where did this come from?


Right? Idk where he gets the idea that being an asshole to service workers leads to innovation


My mind is so friggin blown, github will run arbitrary cron jobs for you?! Can't believe other services make you pay for that.
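For reference, this is done with GitHub Actions' `schedule` trigger. A minimal sketch (the filename and job name are arbitrary):

```yaml
# .github/workflows/nightly.yml - hypothetical scheduled workflow
on:
  schedule:
    - cron: "0 3 * * *"   # daily at 03:00 UTC
  workflow_dispatch:       # also allow manual runs from the Actions tab
jobs:
  nightly:
    runs-on: ubuntu-latest
    steps:
      - run: echo "running on GitHub's dime"
```

One caveat: scheduled runs can be delayed during high load, and GitHub disables schedules on public repos after a period of inactivity, so they aren't a drop-in replacement for a real cron host.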


That's true, concurrency is becoming an important consideration, and it's probably why predictable, FP-oriented langs like Elixir, Rust, and Clojure are seeing good growth.

As for mutating state, I come from an embedded software (microcontrollers) background haha and there's no option but to mutate state


That's a really cool example of how it can work behind the scenes, thanks


Yes, lazy evaluation is a good one. I believe Python does have that with its generators/yield syntax, but I found lazy evaluation simpler to work with in Haskell.
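To make the comparison concrete, here's a small sketch of what lazy evaluation via generators looks like in Python: an infinite stream that is only computed as far as the consumer asks.

```python
import itertools

def naturals():
    """Infinite lazy stream of natural numbers; values are produced on demand."""
    n = 0
    while True:
        yield n
        n += 1

# Nothing is computed until we pull values out of the generator.
first_five_squares = [n * n for n in itertools.islice(naturals(), 5)]
print(first_five_squares)  # [0, 1, 4, 9, 16]
```

The Haskell equivalent (`take 5 (map (^2) [0..])`) reads more directly because laziness is the default there rather than opt-in.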


Ah ok, thanks for clarifying that


Damn, that’s a really detailed exposition of the whole mechanical Turk economy

