Illuminate

Vendor AI Usage Policy

Illuminate requires all third-party vendors to disclose their use of artificial intelligence (AI) in connection with services provided to us or our clients. This policy applies to any vendor that processes, stores, or has access to client data, financial information, or account-level information.

Disclosure Requirements

Vendors must disclose:

  1. Whether AI or machine learning models are used in any part of the services provided to Illuminate
  2. What data inputs the models rely on, including whether client data is used for training, fine-tuning, or inference
  3. Whether AI outputs influence decisions affecting client accounts, portfolios, or personal information
  4. How AI-generated outputs are reviewed, validated, and overridden when necessary
  5. Whether any AI processing is performed by subprocessors or third-party model providers
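The five disclosure items above amount to a structured questionnaire, and teams that track vendor responses sometimes model them as a record. The sketch below is illustrative only: the `VendorAIDisclosure` class, its field names, and the escalation rule are assumptions for this example, not terms of the policy.

```python
from dataclasses import dataclass, field

@dataclass
class VendorAIDisclosure:
    """Hypothetical record mirroring disclosure items 1-5 above."""
    vendor_name: str
    uses_ai: bool                                               # item 1
    data_inputs: list[str] = field(default_factory=list)        # item 2
    client_data_used_for_training: bool = False                 # item 2
    outputs_affect_client_decisions: bool = False               # item 3
    review_process: str = ""                                    # item 4
    ai_subprocessors: list[str] = field(default_factory=list)   # item 5

    def requires_escalation(self) -> bool:
        # Illustrative rule, not policy: flag a disclosure for follow-up
        # if client data feeds training, or if AI outputs influence
        # client-affecting decisions without a documented review process.
        return self.client_data_used_for_training or (
            self.outputs_affect_client_decisions and not self.review_process
        )
```

For example, a vendor disclosing that client data is used for training would be flagged by `requires_escalation()`, while one with no client-data training and a documented review process would not.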

Data Protection Standards

Vendors using AI must confirm:

  1. Client data is not used to train general-purpose models or shared models serving other customers
  2. Client data is not retained in model weights, logs, or caches beyond the scope of the contracted service
  3. AI processing environments meet the same security and encryption standards required under the vendor agreement
  4. Opt-out mechanisms exist where applicable for data inclusion in model training or improvement
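Because the four confirmations above are yes/no attestations, an onboarding review can check them mechanically. A minimal sketch, assuming vendor answers arrive as a simple dictionary; the key names and the `unmet_confirmations` helper are hypothetical, not defined by this policy.

```python
# Hypothetical keys mirroring the four data-protection confirmations above.
REQUIRED_CONFIRMATIONS = [
    "no_training_on_client_data",         # standard 1
    "no_retention_beyond_service",        # standard 2
    "meets_contract_security_standards",  # standard 3
    "training_opt_out_available",         # standard 4 (where applicable)
]

def unmet_confirmations(responses: dict[str, bool]) -> list[str]:
    """Return the confirmations the vendor has not affirmed.

    Missing keys are treated as unconfirmed, so an incomplete
    response surfaces every unanswered standard.
    """
    return [key for key in REQUIRED_CONFIRMATIONS if not responses.get(key, False)]
```

A vendor affirming all four standards yields an empty list; anything returned would be raised with the vendor before approval.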

Ongoing Obligations

Vendors must notify Illuminate within 30 days of any material change to their AI usage, including the introduction of new models, changes in data handling practices, or shifts in subprocessor relationships involving AI. Illuminate reserves the right to request documentation, conduct reviews, or terminate the engagement if AI usage falls outside the parameters of this policy or the vendor agreement.

Accountability

The Chief Compliance Officer (CCO) reviews all vendor AI disclosures as part of the broader vendor review process. Vendors that fail to disclose AI usage, or that use client data in ways not covered by their agreement, are subject to escalation, remediation requirements, or offboarding.