Leveraging AI in an Unpredictable Manner: A Case Study for IT Professionals
Mike Wong, Chief Technology Officer, House730


Mike Wong is the Chief Technology Officer (CTO) of House730, a Hong Kong-based property platform. With a background in computer science, philosophy, and business administration from the Chinese University of Hong Kong, Mike is passionate about integrating technology with humanistic values. His articles explore the intersection of tech and humanities, offering insights on technological impacts and management perspectives in the digital age.
The AI Implementation Dilemma
Our online property search platform, with 100,000 listings, faced an unexpected challenge: 10 percent of user queries treated it as a general search engine for complex property questions. Instead of the simple address search we intended, users asked questions like "What can I get for 10 million HKD in TST?" (Tsim Sha Tsui, a CBD area in Hong Kong with high-end residential buildings).
To address this issue, we needed a Natural Language Processing (NLP) solution that could interpret users' intentions and search our database accordingly. Traditionally, we would have developed our own NLP solution, but dedicating resources to handle just 10 percent of daily queries was difficult to justify. In the era of Generative AI, we believed Large Language Models (LLMs) could help us tackle this task more efficiently, potentially delivering comparable or even better results. But how do we begin implementing GenAI?
As IT professionals, we faced several key considerations:
• Computational resources: Both training and inference require substantial processing power.
• Unpredictable demand: We couldn't foresee how customers would perceive AI integration or its legitimacy in our use case.
• Scalability: We needed to rapidly scale our service up or down based on fluctuating demand.
Given our inability to predict the scale and required resources at this early stage, investing in training our own LLM—or even running our own inference engine—was clearly unwise.
Azure OpenAI Platform
We have partnered with Microsoft Azure to deliver our services to clients for over six years. Given this existing relationship and the availability of OpenAI services on Azure, we naturally opted for a strategic approach using the Azure OpenAI platform.
This approach allowed us to focus on solving business needs rather than technical details. We set up a usable NLP engine within a month that could interpret user input and fetch appropriate property listings—all without involving developers, simply through prompt engineering.
Compared to traditional NLP engines, our LLM-powered engine is bilingual, capable of extracting relevant information from mixed jargon and free text without direct engineering intervention. This is crucial in Hong Kong's dynamic environment, where users often send queries mixing Chinese and English. Before GenAI, tokenizing such queries was a headache for our developers. Now, it's no longer an issue, thanks to the power of the latest cutting-edge model running on the Azure OpenAI platform.
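To illustrate the prompt-engineering approach described above, here is a minimal sketch of how a free-text, possibly mixed-language query might be turned into structured search filters via a chat-style LLM API such as Azure OpenAI. The prompt wording, JSON schema, and function names are our own illustrative assumptions, not House730's production code.

```python
import json

# Illustrative system prompt: instruct the model to return ONLY structured
# JSON filters extracted from the user's free-text property query.
SYSTEM_PROMPT = (
    "You are a property-search assistant for Hong Kong listings. "
    "The user may mix Chinese and English. Extract search filters and reply "
    'ONLY with JSON of the form '
    '{"district": str, "max_price_hkd": int, "keywords": [str]}.'
)

def build_messages(user_query: str) -> list:
    """Build the chat messages to send to the LLM (e.g. Azure OpenAI)."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_query},
    ]

def parse_filters(model_reply: str) -> dict:
    """Validate the model's JSON reply before querying the listings database."""
    filters = json.loads(model_reply)
    if not isinstance(filters.get("max_price_hkd"), int):
        raise ValueError("max_price_hkd must be an integer amount in HKD")
    return filters

# Example: a reply the model might plausibly return for
# "What can I get for 10 million HKD in TST?"
reply = (
    '{"district": "Tsim Sha Tsui", "max_price_hkd": 10000000, '
    '"keywords": ["residential"]}'
)
filters = parse_filters(reply)
print(filters["district"])  # Tsim Sha Tsui
```

Because the heavy lifting (tokenizing mixed Chinese/English text, resolving jargon like "TST") happens inside the model, refining behavior is a matter of editing the system prompt rather than changing code.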
Results and Benefits
With this approach, we solved the 10 percent issue. We can now return results for every legitimate query—whether bilingual, mixed with jargon, or complex—without significant investment in basic LLMs to kickstart the process.
This approach yielded several advantages:
• Enabled accurate prediction of service popularity after initial launch
• Facilitated informed decision-making on resource allocation
• Provided flexibility to transition to a self-hosted LLM if necessary
• Reduced overhead in deploying an AI chatbot in production
• Allowed focus on addressing user pain points rather than technology for its own sake
Conclusion
Leveraging the Azure OpenAI platform, we successfully implemented an AI solution that addressed our unique challenges. This case study demonstrates how IT professionals can harness AI technologies to solve real-world problems efficiently, even in unpredictable scenarios. The key lies in focusing on user needs and leveraging existing technologies strategically, rather than building everything from scratch.