jckarter


morayati
@morayati

I was impressed by most of this article in Curbed, which did some actual reporting on the discrepancy between landlords saying rents in New York City were surging "because people came flooding back" and the fact that, according to all data, people did not actually come flooding back. The reporting goes to impressive shoe-leather lengths -- the author even contacts the New York City Water Board for clarification on how much waste they processed in 2021! Then comes the end, which identifies the culprit as The Algorithm -- specifically, that of RealPage, a Texas-based "property management software" company that sells software-as-a-service to computerize the process of landlords doing landlord shit -- and leaves it there. This section of the article contains a lot of "it's hard to know who all their customers are" and "if this happened, it could do this" speculation, but few specifics. Naturally, I wanted to know more.

RealPage's online training (source: RealPage's training video for AI Revenue Management, 2022; it just gets more infuriating from here, folks)


lizthegrey
@lizthegrey

"It's not collusion if we just happen to all use the same arm's length software company to suggest to us what others are charging/paying and thus what the "market rate" ought to be!"

Rent-fixing happens when all the institutional landlords just "happen" to agree rents should go up at the same time because the Algorithm said so. If you're wondering how employers get away with what ought to be illegal wage-fixing -- they pay databank companies to find out what others are paying, and report their own pay structures to those companies to ensure that nobody is out of step with what the "rest of market" is paying.

They never talk directly to their peers, but the central data broker is, in effect, doing the colluding on each firm's behalf.
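The mechanism described above can be sketched as a toy simulation (entirely hypothetical numbers and aggregation rule, not RealPage's actual algorithm): every firm reports its prices to one broker, the broker suggests a "market rate" slightly above the current average, and every firm independently follows the suggestion.

```python
# Toy sketch of broker-mediated price coordination. Hypothetical numbers
# and a made-up aggregation rule -- not RealPage's actual algorithm.
rents = [1800, 1950, 2000, 2100, 2250]  # five landlords' asking rents

def broker_suggestion(reported_rents, markup=1.02):
    # The broker aggregates everyone's reported rents and suggests
    # slightly above the current average as the "market rate".
    return markup * sum(reported_rents) / len(reported_rents)

for month in range(12):
    suggestion = broker_suggestion(rents)
    # Each client independently "decides" to follow the suggestion;
    # nobody ever talks to a competitor.
    rents = [suggestion] * len(rents)

# After a year, every rent is identical and well above where any
# landlord started -- lockstep increases, no conference call needed.
```

The point of the sketch is that the ratchet comes from the shared intermediary, not from any direct communication between the firms.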



in reply to @morayati's post:

so my takeaway here is that the AI just tells people what the rental market will bear and then sets rents accordingly, and the entire actual reason anyone uses it is that it gets around price-fixing laws by having a middleman ML fall guy instead of the investment-landlord conference calls they had pre-2005.

it's good reporting, I just wish any of the "AI" stuff actually used ML instead of, like, ML as the data-massage equivalent of p-hacking, plus a frontend that obscures what you're actually doing -- since people who work for investment landlords usually aren't in a position to ask many questions, mostly out of ignorance

but it makes sense, actual applications would take work and lying to people is easy as long as the product delivers what it claims

i think it's because that's the only kind of "AI" that actually bears consistent fruit: simmer the sauce down until you've found the key statistics and then just do a normal analysis on those. ML is almost always just an agonizing, roundabout way to pay several people a huge chunk of change to come up with something that reaches mostly the same conclusion but is 10% more influenced by the wrong signals
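A made-up illustration of that "key statistics, then normal analysis" approach: pull out one predictive feature and fit it with closed-form least squares, no training pipeline or framework required (feature names and numbers are invented).

```python
# Hypothetical "simmer it down" analysis: one key feature, ordinary
# closed-form least squares -- no ML framework involved.
def fit_line(xs, ys):
    # Simple linear regression for y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

# Made-up data: square footage vs. monthly rent.
sqft = [450, 600, 750, 900, 1100]
rent = [1700, 2000, 2350, 2600, 3050]
slope, intercept = fit_line(sqft, rent)
```

A two-line model like this is inspectable end to end, which is exactly the property the black-box version gives up.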

the problem with ML is mostly that it has to make money for investors, so it keeps chasing general-purpose applications for the free marketing, but the only place it's actually useful is as like... your in-house data analysis team makes a Bespoke Categorization and Flagging Engine for whatever data you're dealing with, and you treat it like a living machine (the way you would a steam engine or a Connections Museum telephone rack) instead of some un-tweakable black box you bought from the store. But that's still a few developer-graduating-classes off

yeah. and it honestly seems to me like most of the cases where ML would actually be more profitable than an alternative are when it's incorporating signals it shouldn't (e.g., being racist), and the promotional kicker is that it's plausibly deniable that you meant to do that

worth noting that (I am not a lawyer, but) the "AI" stuff seems pretty tangential to the lawsuits -- the basic idea is that the more market share the company gains, and thus the more data about clients' rents it has and the more of them using that data in turn to set their own rents, the more it becomes potential indirect price setting

I mean you yourself acknowledge that previous journalism pieces did a lot of thorough work but not all of it, yeah? and you built on their research. you could apply that compliment to your own piece (which does a lot of thorough work!) and let others build on its gaps when they see it :3

in reply to @lizthegrey's post: