It may not be the easiest surgery in the world, but you can replace the hard drive in a Time Capsule. You'll probably want to replace the power supply too after this much time.
Dreams managed to animate splats on the PS4. Admittedly, not quite the same type of splats, but there is probably a middle ground here where it can be made to work.
Practically, I think the premium only makes sense if the routing layer gives you something operational: one contract/invoice, EU support/legal process, spend caps, audit logs, maybe provider fallback. If it's just a pass-through to the same US/China model endpoints with +5.5%, I don't see much reason for devs to switch on price or sovereignty grounds.
I haven't looked at whether Eden does this, but OpenRouter provides a number of these, and more. I go direct to the major providers and use OpenRouter for the smaller ones because it saves me a lot of hassle.
If Eden provides a similar feature set, I'd certainly consider them.
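For illustration, here's a minimal sketch of the provider-fallback behavior a gateway can offer. All the provider names and call signatures here are hypothetical, not any real gateway's API; a real router would also narrow the exception handling to timeouts, rate limits, and the like.

```python
# Hypothetical sketch of gateway-style provider fallback: try each upstream
# in order and return the first successful completion.
from typing import Callable, Sequence


def route_with_fallback(
    providers: Sequence[tuple[str, Callable[[str], str]]],
    prompt: str,
) -> tuple[str, str]:
    """Return (provider_name, completion) from the first provider that succeeds."""
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # a real router would catch only timeouts, 429s, etc.
            errors.append((name, exc))
    raise RuntimeError(f"all providers failed: {errors}")


# Illustrative stand-ins for upstream model endpoints:
def flaky(prompt: str) -> str:
    raise TimeoutError("upstream timeout")


def stable(prompt: str) -> str:
    return f"echo: {prompt}"


# The flaky primary fails, so the request falls through to the secondary.
name, answer = route_with_fallback([("primary", flaky), ("secondary", stable)], "hi")
# name == "secondary", answer == "echo: hi"
```

Whether that fallback logic lives in your own code or in a paid gateway is exactly the convenience trade-off being discussed.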
It's for a world where it enables you to tell your place of work: "just get us an account there so we have access to all models under a single billing account".
In other words, it solves an organizational problem, not a technical one. That’s what the 5.5% is for.
Whether or not you prefer this or OpenRouter or one of the other LLM gateways is another discussion.
OpenAI and Anthropic steal my money by expiring unused API credits, which is illegal in my country. OpenRouter also has a clause to do that in their terms of service (although they haven't yet, and some employee on their discord assured me they won't). Not sure about Eden AI (they have some fishy stuff in their ToS like "Unless otherwise stated, payments are non-refundable"), but at least I could sue them without buying an international plane ticket if the need arises.
But the most important advantage is the convenience of being able to try out new models without subscribing to yet another service.
And? The point is that it's routed to the same model. Is the middleman's nationality that important, especially when you already accept the existence of a middleman?
If you're an EU business it's easier to do B2B with other EU businesses, just like it's easier for US businesses to do B2B with other US businesses. Not sure this is strange or out of the ordinary, I think it works the same in most places in the world today.
Didn't know I had to, but just for you: nationally it's easy: identify the customer, issue the invoice, then send the product, and that's pretty much it (simplified, obviously). Intra-EU: same thing, but with some additional VIES and VAT steps, mostly still easy. Non-EU business: now it's basically a full export, with all the requirements and declarations that come with it.
Edit: Somehow, the comment got deleted although I replied to it? It originally implied I wouldn't be able to explain why it's easier, so I wrote the above to explain.
It's just someone you don't know who actually runs it (no proper imprint) promoting their business over someone else whose actual operator you also don't know. So you send all your valuable business data to unknown guy A instead of unknown guy B. Oh, and in both cases you couldn't even sign a proper data subprocessing agreement: not with guy A, who doesn't care, and not with guy B, who says he's from Europe, doesn't even bother to provide an address to prove it, and obviously doesn't understand the GDPR.
The net sovereignty gain from switching the middleman is zero. In fact, I'd say using such a "European" router service is actually worse than doing business directly with, let's say, AWS, OpenAI or Anthropic, where you'd at least know whom you're buying from.
Under the circumstance where I'm looking for an "AI Gateway" (not sure why I would, but let's say) and at the same time prefer to use EU businesses, because it tends to be easier and more familiar.
What happens after the AI Gateway doesn't matter that much, since the whole purpose of the product seems to be routing LLM inference requests; if it didn't do that, I don't think they'd have anything to sell in the first place :)
DonHopkins on July 12, 2021, on: I Stopped Using Emojis
>What we saw was, if you go too far in that [representational] direction because you want to be inclusive, people don’t see themselves represented and they’re not going to use it. You have to have enough specificity to represent you enough, but not so inclusive that your emoji palette is hundreds of thousands of emoji.
Scott McCloud wrote a whole book about this: "Understanding Comics".
>One of the book's key concepts is that of "masking," a visual style, dramatic convention, and literary technique described in the chapter on realism. It is the use of simplistic, archetypal, narrative characters, even if juxtaposed with detailed, photographic, verisimilar, spectacular backgrounds. This may function, McCloud infers, as a mask, a form of projective identification. His explanation is that a familiar and minimally detailed character allows for a stronger emotional connection and for viewers to identify more easily.
Scott McCloud and Will Wright discussed masking and other issues in their 2002 GDC discussion, "When Maps Collide":
Understanding Comics and masking influenced The Sims 1's graphics architecture and design (detailed pre-rendered 2D+z sprites for the environment, simplistic real-time 3D graphics for the people). That approach fortunately ran fast on the common non-3D-accelerated graphics hardware of the time, greatly expanding the user base, and it synergistically enabled the user-created content that was essential to the game's success, which I described in this earlier post:
>Going 3D at that time in history meant that the quality of the graphic would take a huge hit, as well as the rendering speed, and fewer people would be able to run it because it would require a high end computer, so it was just not worth it.
>Using 2D pre-rendered sprites means that the artists can use as many polygons, rich textures and lighting techniques as they want in 3D Studio Max, and tweak them until the sprites look perfect, and that's exactly what the user sees. You just could not approach anywhere near that quality with 3D graphics at the time. Of course things are a lot different now!
>That was during the time that The Sims was also in development. One reason The Sims was successful is that it did not try to be full 3D, and ran well on low-end computers (the old computer that little sister inherits from big brother when he upgrades to a gaming machine). It used a hybrid 2D/3D system of z-buffered sprites, with an orthographic projection constrained to four rotations, three zooms, and only the characters were rendered with polygons into the pre-rendered z-buffered scene, using DirectX's software renderer.
>I developed the character animation system and content creation tools for The Sims, and when the EA executives were reviewing the technology to decide if they should buy Maxis, to justify our approach I bought them a copy of Scott McCloud's book Understanding Comics, which explained a concept called "masking" --
>Hergé's Tintin comics are a great example of how that works: The idea is that by making the background environment very realistic (i.e. rich pre-rendered sprites from high poly models), and the characters themselves more abstract (i.e. efficient real time 3d texture mapped low poly models), the readers (players) can more easily project themselves into the scene and identify with the characters. Much in the same way an abstract happy face can represent everyone, while a photograph of a person's face only represents that person.
>The other fortunate consequence was that it was easy for players to create their own characters and objects by editing the textures and sprites with 2D tools like Photoshop, without requiring difficult 3D modeling tools like 3D Studio Max, so that enabled a lot of user created content by kids instead of professional artists, which was essential to the success of the game.
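The hybrid 2D/3D compositing described above boils down to a per-pixel depth test: the real-time character polygons are drawn into a scene that ships with a pre-rendered z-buffer. A toy sketch of that idea (not The Sims' actual renderer; buffers are plain Python lists and smaller depth values are closer):

```python
# Toy z-buffered sprite compositing: draw a real-time-rendered character into
# a pre-rendered background that carries its own depth buffer (2D+z sprites).
# Smaller depth values are closer to the camera.

def composite(bg_color, bg_depth, ch_color, ch_depth):
    """Overwrite background pixels wherever the character passes the depth test."""
    h, w = len(bg_color), len(bg_color[0])
    out_color = [row[:] for row in bg_color]
    out_depth = [row[:] for row in bg_depth]
    for y in range(h):
        for x in range(w):
            if ch_depth[y][x] < out_depth[y][x]:  # character is nearer: it wins
                out_color[y][x] = ch_color[y][x]
                out_depth[y][x] = ch_depth[y][x]
    return out_color, out_depth


# A 1x3 scene: the middle column holds a near foreground sprite (depth 2.0),
# so the character (depth 3.0) is occluded there but visible in column 0.
bg_c = [["wall", "wall", "wall"]]
bg_z = [[5.0, 2.0, 5.0]]
ch_c = [["sim", "sim", "sim"]]
ch_z = [[3.0, 3.0, 9.0]]
color, depth = composite(bg_c, bg_z, ch_c, ch_z)
# color == [["sim", "wall", "wall"]]: the character shows only where it's nearer
```

This is why the sprites could be arbitrarily detailed: the expensive rendering happened offline, and at runtime only the characters needed real polygon rasterization plus this cheap per-pixel comparison.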
IBM employees have garnered six Nobel Prizes, seven Turing Awards, 20 inductees into the U.S. National Inventors Hall of Fame, 19 National Medals of Technology, five National Medals of Science and three Kavli Prizes. As of 2018, the company had generated more patents than any other business in each of 25 consecutive years.
> the company had generated more patents than any other business in each of 25 consecutive years.
A couple of things about those patents, from a former IBMer who earned quite a few in his time there.
First, not all patents are created equal. Most of those IBM patents are software-related, and for pretty trivial stuff.
Second, most of those patents are generated by the rank and file employees, not research scientists. The IBM patent process is a well-oiled machine but they ain't exactly patenting transistor-level breakthroughs thousands of times a year.
Why do you need to generate transistor-level breakthroughs multiple times a year? Those breakthroughs are hard to generate, but they're important and industry-spanning. The problem is we've mostly stopped generating them.
I wasn't saying anything about that, I was just pointing out that yes, IBM produces a ton of patents, but they're mostly trivial junk that regular employees generate en masse in order to earn accomplishments and make up for the insultingly low bonuses.
> they're mostly trivial junk that regular employees generate en masse in order to earn accomplishments and make up for the insultingly low bonuses
We did that at Meta and Amazon too (for polycarbonate puzzle pieces, with no monetary award at all!). Every now and then something meaningful came out of it.
I also worked (briefly, as an intern) at IBM and IBM’s management also sometimes undermined the R&D that happened at the company.
I started at the tail end of one research group's mass exodus. It was like a bomb had gone off; the people left behind were trying to pick up the pieces. In essence, this group developed a sophisticated new technique, which the company urged them to commercialize. Pivoting to commercialization was a big effort, and not naturally within the expertise of this group, but they did it, largely at the expense of their own research productivity, for several years. They even hired programmers (i.e., not people who are primarily computer scientists) and got it done. But just before launch, IBM pulled the plug.
This infuriated the researchers in the group. Keep in mind that career advancement in research is largely predicated on producing new research. In effect, IBM asked people to take a time out and then punished them for agreeing to do it. The whole group was extremely demoralized. Google was the largest beneficiary of this misstep.
I also had a similar, frustrating experience working for Microsoft, so it’s not just IBM, but the same dynamics were at work: bean counters asking researchers to commercialize something and then axing a project as it becomes deliverable.
If AI replaces any role in the company of the future, please let it be the managerial class.
The thing is, Nobel Prizes and other awards don't pay the bills.
Patents do, but in most cases it's trivial patents or patents for a "mutually assured destruction" portfolio (aka, you keep them in hand should someone ever decide to sue you).
That's a fundamental problem with how the Western sphere prioritizes and funds R&D. Either it has direct and massive ROI promises (that's how most pharma R&D works), some sort of government backing (that's how we got mRNA, since pharma corps weren't interested, and how we got the Internet, lasers, radar and microwaves), or some uber-wealthy billionaire (that's how we got Tesla and SpaceX, although government aid certainly helped).
While we cut back government R&D funding in the pursuit of "austerity", China just floods the system with money. And they are winning the war.
mRNA is not a good example. If anything, it's a demonstration of why the Western capitalist model is superior to anything else. Most of the mRNA research was funded by venture capital as a high-risk high-reward investment.
In the world of government-sponsored research, mRNA likely would have been passed over in favor of funding research with more assured results.
Every year they grant prizes. If hardly anyone is doing core R&D because of cost cutting, there is a higher chance those doing the smallest amount of R&D get the prizes.
A Nobel in 2026 doesn't carry the same weight as a Nobel in 1955.
> If you as a reviewer spots things after the implementation rather in the discussion beforehand, that ends up being on both of you, instead of just the implementer who tried to move along to finish the thing
This is accurate, but it's still an important check in the communication loop. It's not all that uncommon for two engineers to discuss a problem, and leave the discussion with completely different mental models of the solution.