M-Tech Labs AI
The groundwork that makes AI safe to turn on.
Copilot, agents and assistants are only as trustworthy as the data, permissions and policies underneath them. We help you get that foundation right — drawing on the governance and compliance depth of M-Tech Systems.
You might be here because…
Most AI consultancy engagements start with one of these. If any sound familiar, there's a conversation to have.
- 01
Copilot licences on hold
You've bought (or are about to buy) Microsoft 365 Copilot, but nobody's comfortable flipping the switch until someone's looked at what it can actually see.
- 02
Shadow AI is already happening
Staff are pasting client data into ChatGPT, Gemini and whatever's in their browser — with no policy, no audit trail and no sign-off.
- 03
SharePoint permissions are a liability
Fifteen years of "share with everyone" links, inherited folders and orphaned groups. Fine until an LLM gets a search API over the top of it.
- 04
A regulator, auditor or client has asked
You need a defensible answer on how AI is used, where the data goes, and what controls exist — and "we're thinking about it" isn't going to cut it.
Where we help.
Short, focused engagements that leave you with a working map of risk, a cleaned-up tenancy and a policy your teams will actually follow.
Data governance & classification
Permissions & identity hygiene
Compliance & regulatory alignment
Responsible AI policy
Vendor & model due diligence
Three ways to start.
Most clients begin with a Readiness Review, then move into a Sprint or a Retainer once the shape of the work is clear.
Readiness Review
A focused audit of your tenancy, permissions and data classification posture against the AI tool you're about to deploy. Ends with a prioritised remediation plan and a go/no-go recommendation.
- Tenancy & oversharing audit
- Risk register with severity
- Remediation roadmap
- Exec-level readout
Governance Sprint
We work alongside your team to actually fix the issues a review uncovers — sensitivity labels, DLP policies, retention, identity hygiene — and stand up the policy and training that keeps it from drifting back.
- Purview / sensitivity labels
- DLP & retention policies
- Permission remediation
- AI acceptable-use policy
Advisory Retainer
A standing advisory relationship for teams rolling out AI over months, not weeks. Quarterly reviews, change assessments for new tools, and regulator-ready documentation kept current.
- Quarterly posture review
- New-tool assessments
- Policy version control
- Incident support on call
What a readiness engagement changes.
- 01
Confidence to turn Copilot on
You know exactly what it can see, who can see it, and what it would never surface. No nasty surprises mid-demo.
- 02
A defensible audit trail
When the regulator, the auditor or the board asks how AI is being used — you have answers, not a shrug.
- 03
Fewer surprises at rollout
Permission and labelling problems are caught before deployment, not during a live incident.
- 04
Teams that know the rules
Clear, short guidance on what's fine, what needs review, and what's off-limits — so adoption doesn't stall on fear.
The groundwork most AI shops don't do.
AI consultancy from a team that already runs production security and compliance — not a dev firm that picked up governance as an afterthought.
- 01
Governance depth, not just dev depth
Most software firms can build you an AI assistant. Very few can tell you whether it's safe to plug it into your tenancy. We do both.
- 02
Assurix + NCSC CAF 4.0 aligned
The same controls we apply to AI rollout — privileged access, supplier risk, monitoring, incident response — are the ones Assurix verifies live against CAF 4.0.
- 03
Backed by a working MSP
M-Tech Systems runs production identity, security and compliance for the organisations we build for. The AI work sits on top of an assurance practice, not alongside it.
We map AI use against the frameworks your auditor already knows.
Not sure if your tenancy is ready for Copilot?
A two-week readiness review tells you where the risks are, what to fix first and what 'good' looks like — before you spend on licences.