thegoleffect's comments

As in the dependencies you add don't run their postinstall scripts, or your app's package defines a postinstall and that script doesn't run?

The latter should work but the former requires an extra step for now iirc: https://bun.sh/docs/cli/install#lifecycle-scripts


The former, specifically Electron. Adding it to "trustedDependencies" didn't seem to help.

https://github.com/oven-sh/bun/issues/1588


A custom, DIY smart monocle from off-the-shelf parts, 3d printing, and custom electronics: 1080p60hz, 8-11hrs of battery life on a belt-clip battery + computer combo, has wifi & lte/cellular, can run ML models on device. One third the weight of upcoming Apple AR/VR glasses and one-sixth the cost. Just having it working has increased my efficiency a ton without obstructing vision or requiring me to look at secondary monitors or phone.

Working on replacing my wireless keyboard and trackpad with some "gloves" so I can use it while on hikes or just generally outside. Then, gonna integrate some custom AR and ML/GPT.


Care to share more details on how you're achieving that?


Which part? The glass is an Epson BT-40 cut in half with some soldering to bypass needing both eyepieces; this cuts the weight & power consumption by ~40%. Mounted onto a printed carbon-fiber nylon frame similar to bone conduction headphones. The computer is a single-board computer I had lying around, but I will upgrade to 12-core 30W SBC next week. The battery is one made for video cameras, and I gave it a belt clip and strapped the SBC onto it. The SBC has camera & mic inputs as well as GPIO for whatever I want to add.


Zak?

Edit: my bad... the cut in half BT-40 reminded me of someone else...


The Ultra 2020 edition and onwards use the Realtek RTD1319 for AV1 decoding.

(disclaimer: I used to work there)


I am so curious about the Roku and what is under the hood. They really do have a lot of value for the cost. Quirky at times, though.


Thank you, just ordered the device.


Congrats on the launch Shah! I can tell you stayed up late giddy for this launch :D. As another peer building for video creators, I am delighted to see more efficiency features like this released.

This was the approach I tried first, too (I also tried the frequency-based one fwiw, which has its own, worse drawbacks). But using loudness runs into issues if the source loudness isn't (relatively) even across the entire source media. Using a single sensitivity setting like this would be a problem if:

* recording gain is set to automatic, and there are sudden changes in noise floor like wind (if recorded in 24-bit or lower)

* crew adjusts gain partway through recording (big no-no but happens)

* talent/host moves in and out of microphone sweet spot

* talent/host adjusts themselves in a squeaky chair during silence or transition-to-silence (or coughs, or breathes loudly, or an ambulance goes by...)

If you apply the edit w/ a single sensitivity and something like the above is true, it would cut in the wrong place. Unfortunately, you would have to watch the entire show, skipping to boundaries with your full attention, to know whether it ever got a cut wrong.
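To make the failure mode concrete, here is a minimal sketch of the single-sensitivity approach: one fixed RMS threshold applied over the whole recording. The window size, threshold, and synthetic audio are all illustrative assumptions, not anyone's actual implementation.

```python
import math

def rms(window):
    """Root-mean-square loudness of one window of samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def silent_spans(samples, rate, threshold, win=1024):
    """Return (start_sec, end_sec) spans whose windowed RMS falls below a
    single fixed threshold -- the naive single-sensitivity approach."""
    spans, start = [], None
    for i in range(0, len(samples) - win + 1, win):
        quiet = rms(samples[i:i + win]) < threshold
        if quiet and start is None:
            start = i
        elif not quiet and start is not None:
            spans.append((start / rate, i / rate))
            start = None
    if start is not None:
        spans.append((start / rate, len(samples) / rate))
    return spans

# Synthetic demo: 1 s of tone, 1 s of near-silence, 1 s of tone at 8 kHz.
rate = 8000
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
audio = tone + [0.001] * rate + tone
print(silent_spans(audio, rate, threshold=0.05))
```

If the noise floor rises partway through (wind, gain ride, squeaky chair), windows that are really "silence" exceed the fixed threshold and the cut lands in the wrong place, which is the problem described above.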


The single-level approach is what Recut does too; it tries to take a guess at a threshold with clustering, but it's not always perfect. Maybe a better way to go would be a dynamic noise gate or Kalman filtering or something.
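A threshold guess via clustering could be sketched roughly like this: a plain 2-means pass over per-window loudness values, taking the midpoint between the "quiet" and "loud" cluster centers. This is purely an assumption about the technique, not Recut's actual code.

```python
def guess_threshold(levels, iters=20):
    """Guess a silence threshold by 2-means clustering of per-window
    loudness values, returning the midpoint between cluster centers."""
    c_quiet, c_loud = min(levels), max(levels)
    for _ in range(iters):
        quiet = [v for v in levels if abs(v - c_quiet) <= abs(v - c_loud)]
        loud = [v for v in levels if abs(v - c_quiet) > abs(v - c_loud)]
        if quiet:
            c_quiet = sum(quiet) / len(quiet)
        if loud:
            c_loud = sum(loud) / len(loud)
    return (c_quiet + c_loud) / 2

# Illustrative per-window RMS values: mostly quiet, a few loud windows.
levels = [0.01, 0.02, 0.01, 0.6, 0.7, 0.02, 0.65, 0.01]
print(guess_threshold(levels))
```

The weakness is the same as before: if the two clusters drift over the course of the recording, a single global split point still cuts in the wrong places.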

Vidbase is looking awesome btw! I bet it's going to be huge. It looks like you've paid an insane amount of attention to the details.


this is super useful insight for us, thank you for sharing. yeah another product we're working on is "auto audio leveling", which I hope solves some of this, but we'll see.

and yes, I was very excited, thank you for checking it out, Van!


The black rectangles exist because the video is a different aspect ratio than your monitor. Not all displays have the same ratio; a monitor with a matching ratio will have no black bars on any side. So by nature, closed captioning has to be within the video bounds.
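The bar size falls straight out of the aspect-ratio arithmetic; the resolutions and ratio below are just illustrative numbers.

```python
def letterbox_bars(disp_w, disp_h, video_aspect):
    """Height in pixels of each black bar when a wider-than-display
    video is letterboxed (width-fitted) on a given display."""
    video_h = disp_w / video_aspect  # video height after fitting to display width
    return max(0, (disp_h - video_h) / 2)

# A 2.39:1 film on a 1920x1080 (16:9) monitor leaves ~138 px top and bottom;
# a 16:9 video on the same monitor leaves no bars at all.
print(round(letterbox_bars(1920, 1080, 2.39)))
print(letterbox_bars(1920, 1080, 16 / 9))
```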


I'm aware of the reason for the black bars. The point is that captions that are not part of the video stream (I believe on DVDs they are not) can be presented below the video if that space exists and the user desires.


That feature is called "subtitle shift", some DVD/BD players have it (and VLC IIRC). Unfortunately streaming clients generally don’t.


An offline-first web version of AE/Prem is what we're working on @ Vidbase. Internally, it works today better than the desktop apps imo. Others are working on similar tools as well.


Yeah, most of the film was shot on a 9.8mm Kinoptik.


Pretty amazing. I've never seen b+w footage in a film have that sort of distortion for that long.


I no longer see the warning on the readme, but this relies on SharedArrayBuffer, so it is not currently supported on mobile (except Firefox for Android) and in some other browsers: https://caniuse.com/sharedarraybuffer


I get this when I visit the OP link:

> Your browser doesn't support SharedArrayBuffer, thus ffmpeg.wasm cannot execute. Please use latest version of Chromium or any other browser supports SharedArrayBuffer.


According to caniuse, some headers are necessary for it to work on Firefox. I guess the developer has to fix the demo.


In fact, the headers must be set server-side, which means I cannot do that as I am using GitHub Pages. You can check more details here: https://github.com/ffmpegwasm/ffmpeg.wasm/issues/102
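For anyone hosting somewhere they control, the headers in question are the cross-origin-isolation pair (COOP/COEP) that browsers require before exposing SharedArrayBuffer. A minimal sketch of a static server that sets them; the port and handler name are arbitrary:

```python
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

class IsolatedHandler(SimpleHTTPRequestHandler):
    """Static file handler that adds the headers SharedArrayBuffer needs."""
    def end_headers(self):
        self.send_header("Cross-Origin-Opener-Policy", "same-origin")
        self.send_header("Cross-Origin-Embedder-Policy", "require-corp")
        super().end_headers()

# To serve the current directory cross-origin isolated:
#   ThreadingHTTPServer(("", 8080), IsolatedHandler).serve_forever()
```

GitHub Pages doesn't let you set response headers, which is why the hosted demo can't do this.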


I noticed you were using GitHub and completely failed to think of that! My bad.


It is on the last line of the Installation section, but yes, ffmpeg.wasm still relies on SharedArrayBuffer for multi-threading.


Most of the devs quit WalmartLabs a long time ago (including myself, Eran, etc). Sponsored in the sense that WalmartLabs paid us to use it to build their services and we developed hapi.js/joi to support our jobs.


I'm genuinely curious if anything produced by Walmart Labs had any sort of "commercial" success or was even adopted within the main Walmart ecosystem. I've certainly heard of hapi, but don't know if it ever gained all that much adoption. Or was it mostly a recruiting tool to make Walmart attractive to a better class of devs?


While I was there, hapi.js was extensively used by the Global eCommerce department and was responsible for fronting the entire mobile API. You can check Eran's blog for stories of how it handled all of Walmart's mobile API traffic, especially during the massive thundering herd of Black Friday traffic. It was used on many other projects, including some "big name" projects; however, I don't know if hapi's involvement was made public for those, so I can't name them directly.


Thanks for the update, and nice work for the community. I didn't end up taking the job at WL, but hapi greatly increased my technical opinion of them and was valuable for recruiting devs, too.


This looks great, but the unfortunate naming means its SEO will always compete against the Flow type-checking library, even if you google "react-flow".

