I’ve spent a lot of time the past few weeks thinking about technology in general and proptech (really, restech) in particular. Those who know me know that I am not one prone to hype. In fact, my entire career has been spent being a bit of a contrarian…fighting against hype and leaning into my belief that multifamily rental housing is truly a “Goldilocks Zone” business—one in which you can hurt yourself by investing too little in tech but you can also hurt yourself by investing too much. The optimal spend (in money and effort) is to find the amount that, like the temperature in baby bear’s porridge, “is just right.” [Note to all readers, the use of the em-dash earlier in this paragraph is not from genAI. This blog is human-written 😊].
Despite my general proclivity to avoid sweeping statements, I really think that historians decades from now may create a new calendar based on February 5, 2026, the day Claude 4.6 went into production. Time before February 5 will be referred to as the new “BC” (“before Claude”) and time after that as “AC” (“after Claude”).
I don’t arrive at that conclusion lightly. Prior to 4.6, I knew AI was important. That’s why we launched our product-oriented rebaAI initiative in April of last year, quickly followed by our internal AI initiative, ElevAIte. But in the words of my partner/co-founder and our CPO, Chris Brust, AI (we were using GitHub Copilot) was a good “junior developer.” It absolutely could speed up development cycles and do some amazing things, but you had to have experienced developers review its work.
In short, “vibe coding” wasn’t quite ready for release-ready product coding.
Claude 4.6 has radically changed this. Now, it’s a senior developer. It’s as good as our best, most experienced engineers. We still have to put it through code reviews and our normal PR release approvals, but we have to do that with humans as well. And with AI code-review tools, that process is dramatically faster, too.
A year ago, I thought our industry was 3-5 years away from truly agentic AI. Today, I realize we are only about a year away—and some would argue we’re already there.
A casualty of this “change in kind, not just degree” is that many of the old rules simply don’t apply. All of my “go to” moves, honed by decades of experience, might actually get in the way. We need to be bold and daring, even in an industry like MFH that normally rewards caution and risk avoidance.
Which brings me to the main point of this blog. One of the oldest strategic IT paradigms that needs to die is the “buy vs build” decision model. Most of my career has been built on making wise buy vs build decisions. In my early days with Archstone, we were known for building (and co-developing) because there were no “buy” options. Just one example: if we wanted online leasing in 2004, we had to build it. By the late ‘00s, there started to be more buy options, though those v1s were often inferior to what we had built (or could build). However, I began to advocate changing to more of a buy strategy. Even if the v1 version was only 75-80% as good as what we needed, future versions would be better and we didn’t have the resources to keep up; nor did we have the same breadth of install base to amortize development costs that the vendors had.
This paradigm was a big part of my consulting practice, D2 Demand Solutions. One of our lines of business was helping with bespoke BI builds because there was no “buy” option that had the configurability needed for good BI. Honestly, the initial inspiration for REBA's whole reason for existence was to offer a buy rather than build option for BI. After Chris and I had worked independently and together on a half dozen large, bespoke builds, we saw the opportunity to productize what we were doing and give the option to buy an MFH purpose-built BI platform with quick implementation rather than invest seven figures of capital on a 2-3 year project just to get a v1.
The growth of DIY AI seems to change this balance very much in favor of build. After all, anyone now has access to a “senior developer,” so who needs proptech vendors? We can each be our own micro-tech vendor, right?
The more I’ve thought about this, the more I’ve realized the mistaken allure this presents…but also the opportunity. While AI like Claude 4.6 makes DIY easier than before, the basic rules still apply:
- Arguments for DIY include the obvious strength of control, over both the content and the timing of enhancements (the latter can be a false allure, since the energy and funds for “v2” often fall prey to budget constraints and can’t compete with the value of other projects’ v1s).
- Arguments for buying solutions include avoiding ownership of the technical debt, as well as vendors’ ability to amortize costs over a broader portfolio and to attract better technical talent with clearer career trajectories.
However, new advances in technology platforms (e.g., Microsoft Fabric and agentic AI orchestration tools) make the whole “buy or build” conundrum a false dichotomy. That’s why I now advocate a new approach—buy AND build! It’s what I would do if I were a CIO or CTO today.
With somewhat limited data science and product/engineering resources, I would want to put them to their highest and best use: creating specialized AI, dashboards and reports adjacent to, and on top of, my data foundation, rather than spending time and effort building that “plumbing” and maintaining all of its tech debt. (For example, any time a system of record changes something in its data feeds or APIs, let the vendor deal with such mundane changes while my team focuses on key analyses.) This would allow me to benefit from my tech-company partners’ strengths, while still creating our own flavors of our special sauce.
Goldilocks didn't settle for “too hot” or “too cold.” She found the option that was just right. The same principle applies here. Buy the foundation, build the advantage, and be grateful that we no longer have to choose.
