Understanding AI
How one Kiwi tech company is taking responsible steps forward in bringing AI to its users.
AI inspires hype, hope and fear – and with good reason. It’s advancing at an astonishing speed, and some of the major players in this space seem content to ask for forgiveness rather than permission when pushing the boundaries of this new technology.
We believe that there should be a period of learning and establishing best practices regarding data security and privacy, particularly in the context of customer data.
First, get to know AI
The quickest way to a blunder in data security and privacy – in any aspect of a data-driven business – is not having a solid understanding of how things work. Learn what current generative AI does, what large language models are, and the flaws that exist in the current technology.
The more you know, the better equipped you’ll be to deliver something that provides real value to your customers and your business.
Understand your legal obligations
Under the Privacy Act 2020, all New Zealanders have a right to privacy, and understanding how that legislation relates to new AI tools is essential to ensure you don’t breach the law.
The Office of the Privacy Commissioner published a whitepaper titled Artificial intelligence and the Information Privacy Principles in September 2023. We recommend reading this document, as it clearly outlines the responsibilities of any New Zealand organisation using AI tools with customer data.
Do you need AI at all?
Now that you understand more about AI, is it the right tool for the task at hand? AI is still in its infancy, though the rapid pace of development can make it feel otherwise. It’s not a silver bullet that will transform your business without careful application.
Does using AI tools significantly benefit the value you provide to your customers, or can the same outcome be achieved in some other manner?
For example, we’ve spent a lot of time attempting to accurately strip irrelevant information from transaction merchant descriptions for users, with limited success. Large language models, however, have turned out to be perfect for this task.
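As an illustration only – this is a hypothetical sketch, not PocketSmith’s actual implementation – cleaning a merchant description with a large language model can be as simple as wrapping it in a prompt and sending it to a chat-completion API such as OpenAI’s. The model name, prompt wording and function names below are all assumptions:

```typescript
// Hypothetical sketch: asking a chat-completion LLM to strip the noise
// (terminal codes, branch suffixes, country codes) from a raw bank
// transaction description. Not PocketSmith's implementation.

function buildCleaningPrompt(rawDescription: string): string {
  // Illustrative prompt wording — embed the raw description verbatim.
  return (
    "Extract only the merchant name from this bank transaction " +
    `description, and reply with nothing else: "${rawDescription}"`
  );
}

async function cleanDescription(raw: string, apiKey: string): Promise<string> {
  // OpenAI's chat completions endpoint; the model choice is illustrative.
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: buildCleaningPrompt(raw) }],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content.trim();
}
```

A raw input like "POS W/D 4523 COUNTDOWN GREY LYNN AKL NZ" would be sent in full to the prompt, with the model left to return just the merchant name.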
Consent is king
Be sure you have your users’ informed consent before sending their data anywhere. Be clear with your customers about which AI service you’re sending data to and for what purpose.
Many people are wary of AI, and often the better informed they are, the warier they become. When you use these services, be sure that you have your customers at the forefront of your mind.
Choose your AI platform wisely
Different platforms handle data and data storage differently, and it pays to understand these differences. Data inputs might be used to train the AI further, and you must know your chosen AI tools’ stance on this.
For example, inputs entered into the consumer ChatGPT application may be used for further AI training, while data sent via the API that backs ChatGPT is not used for training by default. Knowing this distinction and communicating it to your customers is vital.
Don’t store data if you don’t need to
As always, the best data security comes when you don’t hold the data at all. When users interact with the AI features in PocketSmith, both the data sent to the AI and the response returned stay within the user’s web browser.
Not storing the data is the simplest way to maintain user privacy because we don’t have access to it. We’d recommend taking a similar approach wherever possible, even if it makes it slightly more challenging to improve your AI-oriented services.
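The keep-it-in-the-browser pattern might look something like the following sketch, where AI responses live only in an in-memory cache on the client and are never written to the company’s own servers. The endpoint, payload shape and names are hypothetical, not PocketSmith’s API:

```typescript
// Hypothetical sketch of the "keep it in the browser" pattern: the page
// talks to the AI service directly and holds results in memory only,
// so nothing is ever persisted on the business's own servers.

interface AiResult {
  cleaned: string;
}

// In-memory cache only: gone when the tab closes, never sent server-side.
const sessionResults = new Map<string, AiResult>();

async function cleanInBrowser(raw: string): Promise<AiResult> {
  const cached = sessionResults.get(raw);
  if (cached) return cached;

  // Illustrative endpoint — the browser calls the AI service directly.
  const response = await fetch("https://ai.example.com/clean", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ description: raw }),
  });
  const result: AiResult = await response.json();

  sessionResults.set(raw, result); // stays client-side
  return result;
}
```

The design trade-off is stated in the article itself: with no server-side copy of inputs or outputs, there is no stored corpus to analyse when refining the feature, which is the price of not having access to the data at all.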
Make AI work for your customers and your business
If used with intention and care, new generative AI tools can be a game-changer for your business and your customers.
Everyone is learning about the challenges and opportunities presented by this new technology. But with clear and consistent collaboration with your customers, you can provide something that is genuinely helpful to them and aligns with the values of your business.
This article was originally published in the June 2024 issue of NZBusiness magazine.