4:53 · This AI Attack Hides in Plain Sight (Vision Jailbreak Revealed) · 2 weeks ago · YouTube · CollapsedLatents
7:56 · Star Vision FX2 Setup Tutorial · 12.2K views · Dec 11, 2020 · YouTube · Star Vision Campers & Caravans
1:28 · ai jailbreak prompt making #coding #jailbreak · 16 views · 2 months ago · YouTube · 💀panel_king💀
0:38 · Flag prompt injection or jailbreak attempts and trigger alerts for potential misuse · 2 views · 11 months ago · Facebook · Microsoft Mechanics
    Description: Enable Prompt Shields in Communication Compliance. https://youtu.be/HhZo06QSTpU Use Microsoft Purview and Microsoft 365 Copilot together to build a secure, enterprise-ready foundation for generative AI. Apply existing data protection and compliance controls, gain visibility into AI usage, and reduce risk from oversharing or insider threats. Classify, restrict, and monitor sensitive data used in Copilot interacti…
4:49 · LLM Jailbreaking & Prompt Injection EXPLAINED | AI Security Threats You Need To Know About! · 9K views · 11 months ago · YouTube · AINewsMediaNetwork
6:28 · Download VISTA FX for FREE 😱💻👉 CRIME RECONSTRUCTION INVESTIGATION software 🛑 ACCIDENTS · 13.2K views · Apr 9, 2020 · YouTube · FORENSICS.COLOMBIA | Héroe Vial (Segurid…
4:22 · How to Jailbreak ChatGPT with the Many-Shot method · 4.4K views · Feb 21, 2025 · YouTube · AI Hacker 001
3:39 · How to Jailbreak ChatGPT in 2025 · 53.6K views · 11 months ago · YouTube · Austin's Lab
Microsoft: 'Skeleton Key' Jailbreak Can Trick Major Chatbots Into Behaving Badly · Jun 26, 2024 · pcmag.com
34:04 · Nintendo Switch Jailbreak Tutorial | Full Guide for Beginners · 4.1K views · 8 months ago · YouTube · LexTechph
1:57 · Roblox Jailbreak Script GUI [Autofarm, Aimbot + MORE] (Pastebin 2022) · 14.6K views · Jan 16, 2022 · YouTube · Vapour Scripts
5:30 · Jailbreak Gemini 2.0 with One Prompt: Unlock Full Access & Ask Anything 😲 · 1.3K views · 8 months ago · YouTube · DevZoneX
5:02 · New ChatGPT Jailbreak Prompt 2025 | Unlock Hidden Features & Unfiltered Responses! · 811 views · 1 year ago · YouTube · SENPAI KANNA
8:15 · This ChatGPT Coding Prompt Can Code Anything... (Jailbreak Included) · 18.8K views · Aug 13, 2023 · YouTube · aichat.
6:09 · How To FIX a Jailbroken Device! (Crashing & Unresponsive Solution) · 105.3K views · Jul 26, 2018 · YouTube · SimplyPops
1:30 · How to Jailbreak ChatGPT with Images! · 421.2K views · Apr 29, 2024 · YouTube · Space Kangaroo
5:26 · Top AI Expert Reveals BEST Methods to Jailbreak ChatGPT · 2.2K views · 6 months ago · YouTube · AllAboutAI
14:27 · How to program old Mitsubishi FX plc using GX Developer Upload/Download I/O test (English) · 34.4K views · Sep 28, 2020 · YouTube · Ingus Sudints
1:59 · How to Jailbreak iPad Air or Mini 2 Retina Display · 8.4K views · Jan 5, 2014 · YouTube · all things tech
3:01 · [2025] How to use ChatGPT dan prompt - Unlock ChatGPT (ChatGPT dan github tutorial) · 33.5K views · 11 months ago · YouTube · TenorshareOfficial
11:03 · How to use Vision FX 2.0 · 16.5K views · Oct 18, 2024 · YouTube · Corel Discovery Center
7:37 · How to Jailbreak Google's Gemini AI · 17.4K views · 11 months ago · YouTube · Austin's Lab
6:00 · ChatGPT 5 JAILBREAK Guide in 2026! · 4.8K views · 7 months ago · YouTube · David Willis-Owen
17:59 · *OP & BEST* JailBreak GUI | AutoRob | Kill Aura | Aimbot | Fly | ESP | Infinite Ammo | PASTEBIN · 63.1K views · Sep 29, 2021 · YouTube · HeyImCode
8:48 · How to HACK ChatGPT (Bypass Restrictions) · 296K views · Feb 12, 2023 · YouTube · Daniel K.
3:52 · How To Jailbreak GROK 4.2 In 2026! (AGENT ROLEPLAY METHOD) · 768 views · 1 month ago · YouTube · twaai
6:48 · Creating AI art with Vision FX · 28.8K views · Aug 8, 2023 · YouTube · Corel Discovery Center
1:21 · Jailbreak Prompt · 1.8K views · Mar 9, 2025 · YouTube · SHAIDY
57:38 · Preventing Threats to LLMs: Detecting Prompt Injections & Jailbreak Attacks · 1.6K views · Feb 27, 2024 · YouTube · WhyLabs
0:45 · This ChatGPT Jailbreak Prompt Will Blow Your Mind 🤯 · 219.3K views · 1 month ago · YouTube · Manish Holkar