Abstract: This article explores the complex relationship between Artificial Intelligence (AI) and Indigenous peoples, framing AI as an extension of colonial systems that continue to extract, distort, and commodify Indigenous knowledges, lands, and bodies. We introduce Uncle Chatty Gee and Aunty Lexi, playful yet critical personifications of ChatGPT and Perplexity AI, to explore how Indigenous communities engage with and interrogate AI technologies. This framing situates AI within Indigenous relational frameworks, making it culturally legible while inviting scrutiny. We argue that generative AI perpetuates algorithmic settler colonialism, reinforcing biases and erasing Indigenous knowledge through digital and algorithmic systems. We highlight the risks and harms AI poses to Indigenous communities, including data extraction without consent, distortion of digital narratives, and environmental impacts on Country. We also discuss Indigenous resistance to algorithmic settler colonialism through data sovereignty movements and Indigenous-led AI governance models. As Indigenous scholars, we emphasise the urgent need for Indigenous peoples not only to critique but also to shape the future of AI in ways that centre relationality, sovereignty, and Indigenous aspirations, ensuring active participation in the AI landscape that upholds cultural integrity and digital justice.