nybble41

@nybble41@programming.dev
0 Post – 52 Comments
Joined 11 months ago

The EULA also prohibits using Nightshade "for any commercial purpose", so arguably if you make money from your art—in any way—you're not allowed to use Nightshade to "poison" it.

Allegories aside, the Bible definitely has a few LGBTQ characters, even if they're not portrayed in a very positive light. I suppose that means they'll be banning the Bible from school libraries? Not to mention a fair amount of historical literature… including anything featuring Leonardo da Vinci, Florence Nightingale, King James (yes, that King James), William Shakespeare, King Richard I, or Julius Caesar.

It will be interesting to see whether this makes the history classes easier, for lack of material to cover, or harder, for lack of references.

I'm fairly certain that last one is UB in C. The result of an assignment operator is not an lvalue, and even if it were, it's UB (at least in C99) to modify the stored value of an object more than once between two adjacent sequence points. It might work in C++, though.

It is not true that every node is an exit node in I2P. The I2P protocol does not officially have exit nodes—all I2P communication terminates at some node within the I2P network, encrypted end-to-end. It is possible to run a local proxy server and make it accessible to other users as an I2P service, creating an "exit node" of sorts, but this is something that must be set up deliberately; it's not the default or recommended configuration. Users would need to select a specific I2P proxy service (exit node) to forward non-I2P traffic through and configure their browser (or other network-based programs) to use it.


No, that's not how I2P works.

First, let's start with the basics. An exit node is a node which interfaces between the encrypted network (I2P or Tor) and the regular Internet. A user attempting to access a regular Internet site over I2P or Tor would route their traffic through the encrypted network to an exit node, which then sends the request over the Internet without the I2P/Tor encryption. Responses follow the reverse path back to the user. Nodes which only establish encrypted connections to other I2P or Tor nodes, including ones used for internal (onion) routing, are not exit nodes.

Both I2P and Tor support the creation of services hosted directly through the encrypted network. In Tor these are referred to as onion services and are accessed through *.onion hostnames. In I2P these internal services (*.i2p or *.b32) are the only kind of service the protocol directly supports—though you can configure a specific I2P service linked to an HTTP/HTTPS proxy to handle non-I2P URLs in the client configuration. There are only a few such proxy services, as this is not how I2P is primarily intended to be used.

Tor, by contrast, has built-in support for exit nodes. Routing traffic anonymously from Tor users to the Internet is the original model for the Tor network; onion services were added later. There is no need to choose an exit node in Tor—the system maintains a list and picks one automatically. Becoming a Tor exit node is a simple matter of enabling an option in the settings, whereas in I2P you would need to manually configure a proxy server, inform others about it, and have them adjust their proxy configuration to use it.

If you set up an I2P node and do not go out of your way to expose an HTTP/HTTPS proxy as an I2P service then no traffic from the I2P network can be routed to non-I2P destinations via your node. This is equivalent to running a Tor internal, non-exit node, possibly hosting one or more onion services.


They could stick to public domain & indie titles. They won't, but they could.


The ubuntu:24.04 Docker image is only 77.30 MiB.

alpine:3.19.0 is 7.38 MiB.

Of course those sizes are without a kernel. Typical everything-included distro kernels are generally a few hundred MiB as they include drivers for everything that might be needed, but a custom build for known hardware can reduce that to just a few MiB.

Cx File Explorer supports SMB, FTP, SFTP, and WebDAV remotes out of the box. There is an option to browse the local network.

Look up the legal principle of estoppel. In general you can't turn around and sue someone for doing something after informing them (in writing no less) that you're okay with it, even if you would otherwise have had a valid basis to sue.

It is just as ridiculous that Republicans in California have little say in the presidency as Democrats in Wyoming.

The Republicans in California have a better chance of seeing a Republican president with the electoral college than they would with a national popular vote, even if their particular votes carry less weight. In a sense that gives them more representation in the end, not less—their voices are ignored but they get what they wanted anyway.

The mother had a claim because the house was literally given to her, which was its previous owner's right.

This person has no claim.

If the previous owners wanted it to remain with the family line they should have formalized that by placing the house in a trust.

That part is messed up. You shouldn't be dealing with individual contractors as a patient. All billing should go through the hospital, and be considered in-network provided the hospital is in-network, regardless of what kind of specialist sees you there. Any exception, such as bringing in someone who doesn't normally work there to treat a rare condition, should require separate and specific authorization from the patient in advance.

When you have an actual functioning competitive market the money you bring in correlates with the value of the service you provide, so it makes perfect sense to be happy about the money the new surgical center is bringing in. That means it's useful.

The problem is that the health care market is regulated and subsidized in so many ways, many of them conflicting with each other, that competition is very limited and price discovery is reduced to "whatever the patient (and their insurance) can afford to pay" since they can't go anywhere else. Fix that and there won't be any reason for hospital owners or employees to feel guilty about making money.

Personally, I'd love it if Democrats became the right-most party by staying exactly as they are, and a new party breaks off of them or evolves out to their left.

I'd say it's more likely to go the other way, with the more moderate or right-leaning Democrats breaking off to form their own party and perhaps steal away the more moderate Republican voters. There are a lot of voters who would naturally align more closely with traditional Republican political views voting Democrat only because the Republican party has been taken over by a radical faction. Having laissez-faire fiscal conservatives and outright socialists in the same party isn't really sustainable long-term; there are too many critical points of disagreement.

What "increased risks as far as csam"? You're not hosting any yourself, encrypted or otherwise. You have no access to any data being routed through your node, as it's encrypted end-to-end and your node is not one of the endpoints. If someone did use I2P or Tor to access CSAM and your node was randomly selected as one of the intermediate onion routers there is no reason for you to have any greater liability for it than any of the ISPs who are also carrying the same traffic without being able to inspect the contents. (Which would be equally true for CSAM shared over HTTPS—I2P & Tor grant anonymity but any standard password-protected web server with TLS would obscure the content itself from prying eyes.)


So you're not remapping the source ports to be unique? There's no mechanism to avoid collisions when multiple clients use the same source port? Full Cone NAT implies that you have to remember the mapping (potentially indefinitely—if you ever reassign a given external IP:port combination to a different internal IP or port after it's been used you're not implementing Full Cone NAT), but not that the internal and external ports need to be identical. It would generally only be used when you have a large enough pool of external IP addresses available to assign a unique external IP:port for every internal IP:port. Which usually implies a unique external IP for each internal IP, as you can't restrict the number of unique ports used by each client. This is why most routers only implement Symmetric NAT.

(If you do have sufficient external IPs the Linux kernel can do Full Cone NAT by translating only the IP addresses and not the ports, via SNAT/DNAT prefix mapping. The part it lacks, for very practical reasons, is support for attempting to create permanent unique mappings from a larger number of unconstrained internal IP:port combinations to a smaller number of external ones.)
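To make the distinction concrete, here's a toy Python sketch of the two mapping policies. Everything here is invented for illustration; a real NAT also tracks protocols, timeouts, and connection state:

```python
# Toy model contrasting Full Cone and Symmetric NAT mapping behavior.
import itertools

class FullConeNAT:
    """One permanent external mapping per internal (ip, port), reused for
    every destination. Needs enough external ip:port pairs that a mapping
    never has to be reassigned."""
    def __init__(self, external_ports):
        self.free = iter(external_ports)
        self.map = {}

    def translate(self, src, dst):
        # dst is ignored: the mapping depends only on the internal endpoint
        if src not in self.map:
            self.map[src] = next(self.free)
        return self.map[src]

class SymmetricNAT:
    """A fresh external mapping per (internal endpoint, destination) pair,
    so external ports can be reused freely and collisions are avoided."""
    def __init__(self, external_ports):
        self.free = iter(external_ports)
        self.map = {}

    def translate(self, src, dst):
        key = (src, dst)
        if key not in self.map:
            self.map[key] = next(self.free)
        return self.map[key]

fc = FullConeNAT(itertools.count(40000))
sym = SymmetricNAT(itertools.count(40000))

src = ("10.0.0.2", 5000)
a = fc.translate(src, ("1.1.1.1", 53))
b = fc.translate(src, ("8.8.8.8", 53))
assert a == b == 40000           # same external port for any destination

c = sym.translate(src, ("1.1.1.1", 53))
d = sym.translate(src, ("8.8.8.8", 53))
assert (c, d) == (40000, 40001)  # one mapping per destination
```

The Full Cone version is the one that can run out of (or be forced to reuse) external ports when many internal endpoints are active, which is the scaling problem described above.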

I'd settle for just the limits, personally.

The part that makes me the most paranoid is the outbound data. They set every VM up with a 5 Gbps symmetric link, which is cool and all, but then you get charged based on how much data you send. When everything's working properly that's not an issue as the data size is predictable, but if something goes wrong you could end up with a huge bill before you even find out about the problem. My solution, for my own peace of mind, was to configure traffic shaping inside the VM to throttle the uplink to a more manageable speed and then set alarms which will automatically shut down the instance after observing sustained high traffic, either short-term or long-term. That's still reliant on correct configuration, however, and consumes a decent chunk of the free-tier alarms. I'd prefer to be able to set hard spending limits for specific services like CPU time and network traffic and not have to worry about accidentally running up a bill.
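For reference, a minimal traffic-shaping sketch along those lines, assuming a Linux VM with iproute2; the device name and rates are examples, not my actual configuration:

```shell
# Throttle the uplink so a runaway process can't blast data out at the
# full 5 Gbps (token bucket filter on the egress root qdisc):
tc qdisc add dev eth0 root tbf rate 200mbit burst 256kbit latency 50ms

# Inspect the shaping and its counters, or remove it again:
tc -s qdisc show dev eth0
tc qdisc del dev eth0 root
```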

Not every work produced by a LLM should count as a derivative work—just the ones that embody unique, identifiable creative elements from specific work(s) in the training set. We don't consider every work produced by a human to be a derivative work of everything they were trained on; work produced by (a human using) an AI should be no different.

Most of this is personal opinion and snobbery that I can't do much about except maybe ask that you examine how anarcho-capitalist your takes sound.

Objectivist, perhaps. They're the ones who obsess over controlling and monetizing free external benefits. There is no copyright in anarcho-capitalism (including "moral rights" etc.) so the GP doesn't sound at all anarcho-capitalist while arguing for infringement of others' real property rights to prop up their own artificial (non-rivalrous) "intellectual property" rights.

Sure, they don't rule the world. They only have the power to ban you (either the company per se or its individual owners, officers, and/or employees) from ever again doing any business in the EU. Which naturally includes business with any individuals or companies either based in the EU (as a seller or a buyer) or wanting to do business in the EU. Or from traveling to the EU, whether for business or personal reasons. Little things like that. Nothing too inconvenient. (/s)

They haven't taken things quite that far—yet. But they could. It's dangerous to assume that you can ignore them without consequences just because your company doesn't currently depend on revenue from EU customers. The world is more interconnected than that, and the consequences may not be limited to your company.

Not the GP but I also use tmux (or screen in a pinch) for almost any SSH session, if only as insurance against dropped connections. I occasionally use it for local terminals if there is a chance I might want a command to outlive the current graphical session or migrate to SSH later.

Occasionally it's nice to be able to control the session from the command line, e.g. splitting a window from a script. I've also noticed that wrapping a program in tmux can avoid slowdowns when a command generates a lot of output, depending on the terminal emulator. Some emulators will try to render every update even if it means blocking the output from the program for the GUI to catch up, rather than just updating the state of the terminal in memory and rendering the latest version.


The full email address syntax described in the RFC cannot be precisely matched with a mere regular expression due to the support for nested comments. The need to track arbitrarily deep nesting state makes it a non-regular language.

If you remove the comments first the remainder can be parsed with a very complex regex, but it will be about a kilobyte long.
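To illustrate why the comment-stripping pass needs real state, here's a Python sketch: the depth counter is exactly the part a true regular expression can't express. The `SIMPLE` pattern is a deliberately tiny stand-in for the kilobyte-long real one, and this ignores quoted strings and escapes for brevity:

```python
import re

def strip_comments(addr):
    """Remove RFC 5322 comments: parenthesized and possibly nested.
    Nesting depth is unbounded, hence not a regular language."""
    out, depth = [], 0
    for ch in addr:
        if ch == "(":
            depth += 1
        elif ch == ")" and depth:
            depth -= 1
        elif depth == 0:
            out.append(ch)
    return "".join(out)

# A deliberately simplified pattern for the comment-free remainder;
# the full RFC 5322 addr-spec grammar is far larger than this.
SIMPLE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

addr = "john.doe(work(main))@example.com"
assert strip_comments(addr) == "john.doe@example.com"
assert SIMPLE.match(strip_comments(addr))
```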

CVS and E*Trade both refused to accept my fairly standard user@mydomain.info address during initial registration, but had no issue changing to that address once the account was created. It would be nice if their internal teams communicated a bit better.

It would be a nominal charge for storage, bandwidth, and indexing. Book stores carry public-domain titles, for profit, and most have no issue with that. You can always procure the same files somewhere else—they are public domain, after all. Those who pay are doing so for the convenience, not because they're forced to.

Bingo. They could do it themselves, but they want to spend other people's money, not just their own. Same as any other tax. Bonus PR points for appearing "generous".

Cx File Explorer has a similar feature, along with a built-in FTP client. Another option would be to run an SSH server like SimpleSSHD on the device you want to share files from so you can access them via SFTP, which Cx File Explorer also supports. This permits more secure public key-based authentication rather than just a password.

The monopoly issue won't be resolved so long as there is artificial exclusivity over the content, i.e. copyright. That's the most critical monopoly of all. Different streaming services can't compete on how good they are at streaming because their content isn't interchangeable; you can't just swap one show for another even when they're similar in style and production quality.

The absolute minimum requirement to resolve this would be obligatory "reasonable and non-discriminatory" mechanical licenses allowing any streaming service to stream any content on equal terms regardless of source.

You misunderstood. It's not a middle ground between "can regulate" and "cannot regulate". That would indeed be idiotic. It's a middle ground between "must judge everything for yourself" and "someone else determines what you have access to". Someone else does the evaluation and tells you whether they think it's worthwhile, but you choose whose recommendations to listen to (or ignore, if you please).

No, I am not okay with bans like that. You should be able to knowingly buy products with mercury in them. Obviously if someone is selling products containing mercury and not disclosing that fact, passing them off as safe to handle, that would be a problem and they would be liable for any harm that resulted from that. But it doesn't justify a preemptive ban.


Who is enforcing this and how?

Liability would be decided by the courts or another form of binding arbitration. Obviously. Harming someone through action or negligence is a tort, and torts are addressed by the judicial branch. Both sides would present their arguments, including any scientific evidence in their favor—the FDA or similar organizations could weigh in here as expert witnesses, if they have something to offer—and the court will decide whether the vendor acted reasonably or has liability toward the defendant.

If you knowingly sell me a car with an engine about to fail, you are in no way accountable.

If you knew that the engine was about to fail and didn't disclose that fact, or specifically indicate that the vehicle was being sold "as-is" with no guarantees, then you certainly should be accountable for that. Your contract with the buyer was based on the premise that they were getting a vehicle in a certain condition. An unknown fault would be one thing, but if you knew about the issue and the buyer did not then there was no "meeting of the minds", which means that the contract is void and you are a thief for taking their payment under false pretenses.

Anyway, you continue to miss the point. I'm not saying that everyone should become an expert in every domain. I'm saying that people should be able to choose their own experts (reputation sources) rather than have one particular organization like the FDA (instance/community moderators) pre-filtering the options for everyone. I wasn't even the one who brought up the FDA—this thread was originally about online content moderation. If you insist on continuing the thread please try to limit yourself to relevant points.


The most valuable thing is an experienced team who thoroughly understand both the specifications and the implementation as well as the reasoning behind both. Written specifications are great as onboarding and reference material but there will always be gaps between the specifications and the code. ("The map is not the territory.") Even with solid specifications you can't just turn over maintenance of a codebase to a new team and expect them to immediately be productive with it.

with books there's basically no reasonable way to create an ebook from a hardcopy

On the contrary, tons of books have been digitized from hard copies through a combination of OCR and manual editing. (E.g.: Project Gutenberg.) The same basic process works for both printed books and pages displayed on an e-reader. It's quite tedious but not exactly difficult. Anyone with a smartphone can submit usable scans, though some simple DIY equipment speeds up the process and improves the quality, and OCR is getting better all the time.

In the worst case the book can simply be retyped. People used to copy books by hand after all, using nothing more sophisticated than pen/quill and paper/parchment/papyrus. Unlike in those days the manual effort is only needed once per title, not per copy.


The average person would just download it. Only one needs the equipment to digitize it. And that equipment isn't as specialized as you seem to think. For printed (mass-produced) books you can just cut the pages from the spine and feed them in batches through an automated document feeder, which comes standard with many consumer-grade scanners. Automated page-turning on an e-reader can be done with a software plugin in some cases, or externally with something like a SwitchBot. Capturing copy-restricted video is frankly much more involved, and that hasn't stopped anyone so far.


Historically speaking, people have gone to the trouble of manually digitizing hard copy books to distribute freely. There were digital copies of print books available online (if you knew where to look) before e-books were officially available for sale in any form. That includes mass-market novels as well as items of interest to historians. Ergo, your scepticism seems entirely unjustified.

OCR is far from perfect (though editing OCR output is generally faster than retyping), but even without it we have the storage and bandwidth these days to distribute full books as stacks of images if needed, without converting them to text. The same way people distribute scans of comics/manga.

Examples of local commands I might run in tmux could include anything long-running which is started from the command line. A virtual machine (qemu), perhaps, or a video encode (ffmpeg). Then if I need to log out or restart my GUI session for any reason—or something goes wrong with the session manager—it won't take the long-running process with it. While the same could be done with nohup or systemd-run, using tmux allows me to interact with the process after it's started.

I also have systems which are accessed both locally and remotely, so sometimes (not often) I'll start a program on a local terminal through tmux so I can later interact with it through SSH without resorting to x11vnc.
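A sketch of that workflow, with made-up session and file names:

```shell
# Start a long encode in a detached tmux session so it survives logout
# or a GUI session crash:
tmux new-session -d -s encode 'ffmpeg -i in.mkv -c:v libx265 out.mkv'

# Later, from any terminal (local or over SSH), reattach to check on it:
tmux attach -t encode

# Scripted control also works, e.g. split the window to watch a log:
tmux split-window -t encode -v 'tail -f encode.log'
```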

They should still be using the CPU's built-in AES hardware acceleration, yes? It seems they have good reason not to trust the SSD to handle the encryption but that doesn't mean it has to be entirely implemented in software. CPU-accelerated AES shouldn't be that much slower.

Except it's not even that indirect. The government of Texas invented this novel class of private liability, and their courts are the ones enforcing it. That's the same as banning it themselves, and blatantly unconstitutional.

I'm a bit surprised they didn't implement this as a tax. That would be just as bad, but the federal government has a long history of imposing punitive taxes on things they aren't allowed to ban; it would have been harder to fight it that way without forcing an overhaul of the entire tax system… and politicians are so very fond of special-purpose taxes and credits.

Nothing says that the owner/buyer of a car has to be the one who drives it. You could buy a car and have someone else drive you around. Or just buy one for someone else to use—for example a parent who doesn't drive could buy a car for their child who has a license. Or vice-versa. Either way there is no reason for the buyer to need a license.

In what sense do you think this isn't following the email standard? The plus sign is a valid character in the local part, and the standard doesn't say how it should be interpreted (it could be a significant part of the name; it's not proper to strip it out) or preclude multiple addresses from delivering to the same mailbox.

Unfortunately the feature is too well-known, and the mapping from the tagged address to the plain address is too transparent. Spammers will just remove the label. You need either a custom domain so you can use a different separator ('+' is the default but you can generally choose something else for your own server) or a way to generate random, opaque temporary addresses.

If you want to talk about non-compliant address handling, aside from not accepting valid addresses, the one that always bothers me is sites that capitalize or lowercase the local part of the address. Domain names are not case-sensitive, but the local part is. Changing the case could result in non-delivery or delivery to the wrong mailbox. Most servers are case-insensitive, but senders shouldn't assume that is always true.
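A sketch of both points in Python (the helper names are mine; real mail servers decide their own local-part semantics):

```python
def normalize(addr):
    """Lowercase only the domain. Per the SMTP standard the local part
    is case-sensitive, so 'John@x.org' and 'john@x.org' may be distinct
    mailboxes even though most servers treat them the same."""
    local, _, domain = addr.rpartition("@")
    return f"{local}@{domain.lower()}"

def strip_plus_tag(addr):
    """What a spammer can trivially do when '+' tagging is this
    transparent: drop the label and recover the plain address."""
    local, _, domain = addr.rpartition("@")
    return f"{local.split('+', 1)[0]}@{domain}"

assert normalize("John.Doe@Example.COM") == "John.Doe@example.com"
assert strip_plus_tag("john+shop@example.com") == "john@example.com"
```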

To put it another way: do you think we should have the FDA? Or do you think everybody should have to test everything they eat and put on their skin?

There is a middle ground. The FDA shouldn't have the power to ban a product from the market. They should be able to publish their recommendations, however, and people who trust them can choose to follow those recommendations. Others should be free to publish their own recommendations, and some people will choose to follow those instead.

Applied to online content: Rather than having no filter at all, or relying on a controversial, centralized content policy, users would subscribe to "reputation servers" which would score content based on where it comes from. Anyone could participate in moderation and their moderation actions (positive or negative) would be shared publicly; servers would weight each action according to their own policies to determine an overall score to present to their followers. Users could choose a third-party reputation server to suit their own preferences or run their own, either from scratch or blending recommendations from one or more other servers.
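A toy sketch of that scoring model, with invented moderator names and weights:

```python
# Public moderation actions are shared as (moderator, content_id, vote)
# tuples; each reputation server applies its own per-moderator weights.
def score(content_id, actions, weights):
    """Weighted sum of votes for one piece of content; moderators this
    server doesn't know (or trust) get weight 0."""
    return sum(vote * weights.get(mod, 0.0)
               for mod, cid, vote in actions if cid == content_id)

actions = [
    ("alice", "post-1", +1),
    ("bob",   "post-1", -1),
    ("carol", "post-1", -1),
]

# Two reputation servers weighing the same public actions differently:
strict  = {"alice": 0.5, "bob": 1.0, "carol": 1.0}
lenient = {"alice": 2.0, "bob": 0.5, "carol": 0.0}

assert score("post-1", actions, strict)  == -1.5   # hidden by this server
assert score("post-1", actions, lenient) == 1.5    # shown by this one
```

The point is that the raw moderation data is shared and public; only the weighting (and therefore the verdict) differs between servers, so users pick the verdict they trust.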
