FAQ
Non-consensual Intimate Image Sharing
NCII is the act of sharing intimate images or videos of someone, either online or offline, without their consent.
Intimate images are images and videos of people who are naked, showing their genitals, engaging in sexual activity or poses, or wearing underwear in compromising positions.
NCII is increasingly being recognised as a crime in different parts of the world. You may want to reach out to someone qualified to give you legal advice.
In the UK, for example, it is an offence to disclose a private sexual photograph or film without the consent of an individual who appears in the photograph or film, with the intention of causing distress. Learn more about intimate image abuse laws in England and Wales, Scotland, Northern Ireland, and the Republic of Ireland here.
See our Resources and Support page to learn more about NCII.
Anyone can be a victim of adult intimate image abuse, regardless of their gender and sexuality.
If the image is of someone under the age of 18 – even if it is of yourself and you are now over 18 – please contact the National Center for Missing and Exploited Children or Internet Watch Foundation for support.
In some cases, the sharing of intimate content may be an act of perceived “revenge” against a person after a relationship breakdown, though this horrible act can be done for any motivation, or none at all.
If the images are of you, you are over 18 years of age, and they are intimate in nature or depict a sexual act, then you can submit them to be hashed.
Regardless of who else is in the image, if you are depicted in an intimate image and it is shared without your consent, this is NCII and you can submit this image to the hash bank.
If your intimate images have already been shared online, your first step should be to report them to the platforms where they appear.
Bumble
- Bumble Safety Center
- Block and Report: how to report someone directly to a member of the Bumble team
Discord
- Safety Center
- Reporting directly to Trust and Safety
- Submitting an Abuse Report: Fill out this form, including all relevant information (message IDs, channel IDs, etc.)
Google
- Google Help Center
- Removal request for NCII: You or an authorized representative can submit a request to remove links to the content from Google search results.
Messenger
- Messenger Help Center: Reporting Conversations
Snapchat
- Reporting Abuse
- Safety Resources and Support
- Wellbeing resources include in-app reporting tools, so users can anonymously report when their friends are at risk from harmful content. Snap then shares resources with both the person reporting and the victim.
TikTok
Twitch
- Help Center
WhatsApp
- Blocking and Reporting Contacts
- Reporting someone means WhatsApp receives the most recent messages sent to you by the reported user or group, as well as information on your recent interactions with the reported user.
YouTube
- YouTube Help Center
- YouTube Commitment to Managing Harmful Content
- Reporting inappropriate content
- When something is reported, it is not automatically taken down. You can report videos, playlists, thumbnails, links, comments, live chats, channels, and ads. YouTube staff review reported content 24 hours a day, seven days a week.
- Users can also report inappropriate search predictions
If we have not covered your question, please do get in touch with our team at stopncii@swgfl.org.uk and we will try to add it to our FAQs in the future.
Please note that this email is not a support email and cannot offer any additional support for removing/reporting NCII. If you feel you need additional support, please refer to our support page.
We are not able to recover any case numbers, PINs or remove any hashes, and any requests of this nature will not get a response.
The StopNCII.org tool
The tool works by generating a hash from your intimate image(s)/video(s). Image hashing is the process of using an algorithm to assign a unique hash value to an image. Duplicate copies of the image all have the exact same hash value. For this reason, it is sometimes referred to as a ‘digital fingerprint’. StopNCII.org then shares the hash with participating companies so they can detect the images and prevent them from being shared online.
A digital fingerprint – or hash, as it is technically known – is like a barcode attached to an image/video when it is put through our technology. The hash is then stored in the StopNCII.org bank and shared with partner platforms. Hashes are then compared against every image uploaded to a partner platform, and if one matches, the image is removed. The algorithms we use are PDQ for photos and MD5 for videos; both are open source and industry standard for applications like ours.
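The exact-duplicate property described above can be illustrated with MD5, the algorithm named here for videos. (PDQ, used for photos, is a perceptual hash that needs a dedicated library, so this sketch uses only Python's standard-library MD5; the function name is illustrative, not StopNCII.org's actual code.)

```python
import hashlib

def md5_fingerprint(data: bytes) -> str:
    """Return the MD5 digest of a file's bytes (the 'digital fingerprint')."""
    return hashlib.md5(data).hexdigest()

original = b"\x00\x01 example video bytes"
duplicate = bytes(original)       # an exact copy of the file
edited = original + b"\xff"       # even a one-byte change

print(md5_fingerprint(original) == md5_fingerprint(duplicate))  # True: exact copies match
print(md5_fingerprint(original) == md5_fingerprint(edited))     # False: any edit changes the hash
```

This is also why, as noted below, a cropped or filtered copy of an image needs to be hashed separately: the altered bytes produce an entirely different fingerprint.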
Once you have created your hash and submitted it to the StopNCII.org bank it will be sent out to participating companies’ platforms. If someone tries to upload a matching image, the platforms will review the content to check if it violates their policies and take action accordingly.
No one else will see your images when the hash is generated; the images will not leave your device. If someone tries to upload a matching image to one of our participating companies’ platforms, that company will review the content on their platform to check if it violates their policies and take action accordingly.
Unfortunately, this is not recoverable in any way, so please keep this information safe. If you lose your case number or PIN you will not be able to check your case status or withdraw your hashes. We cannot help you reset your PIN or retrieve your case number, since we do not save that information.
During the submission process, you can choose to have your case number sent to you using a system generated email. For your privacy and safety, we will not store your email address if you do this.
We want to keep the amount of data you need to share minimal, so we don’t ask, or need, your email address to create a case. All you need to do is hash your images and keep hold of your PIN and case number. This way, minimal personally identifying information (only what we really need to run this service) is stored by us.
StopNCII.org uses image-matching technologies that are widely used by the tech industry to detect exact matches of previously known violating content and prevent it from being reshared, protecting people from harm on their platforms.
If an image that has been hashed is edited through cropping, filters added or a video clipped, the original hash may not recognise the image. The new image will need to be hashed separately.
Images and videos should be in one of the following formats: jpeg, jpg, png, gif, psd, tiff, xcf, tga, miff, ico, dcm, xpm, pcx, bmp, mp4, mov, avi, qt or wmv. Larger videos may need to be edited down to smaller sizes and hashed individually. The maximum number of hashes that can be uploaded at one time is 20, to minimise the processing load on individual devices.
Common reasons why your hash may have failed include:
- Your image may not have been the correct format. Acceptable formats are jpeg, jpg, png, gif, psd, tiff, xcf, tga, miff, ico, dcm, xpm, pcx, bmp, mp4, mov, avi, qt and wmv.
- If you tried submitting a video, the file size might be too large for your browser to support this process.
- Alternatively, there may have been a technical error.
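The format check described above can be sketched as a simple client-side validation. The extension list and the 20-file limit come from this FAQ; the function and constant names are illustrative assumptions, not StopNCII.org's actual code:

```python
# File extensions this FAQ lists as acceptable for hashing.
ALLOWED_EXTENSIONS = {
    "jpeg", "jpg", "png", "gif", "psd", "tiff", "xcf", "tga", "miff",
    "ico", "dcm", "xpm", "pcx", "bmp", "mp4", "mov", "avi", "qt", "wmv",
}
MAX_FILES_PER_SUBMISSION = 20  # at most 20 hashes per upload

def is_supported(filename: str) -> bool:
    """Check whether a file's extension is on the accepted list."""
    ext = filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    return ext in ALLOWED_EXTENSIONS

print(is_supported("holiday.JPG"))  # True: the check is case-insensitive
print(is_supported("clip.mkv"))     # False: mkv is not an accepted format
```

A pre-check like this only looks at the file name; an actual submission could still fail on file size or a technical error, as the list above notes.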
The original image will work best, but if you only have a screenshot you can use that. Before submitting, just remember to remove anything around the image (e.g. any borders) without cropping the image itself.
No. Your images will never leave your device and they will never be saved by us. We will only store the digital fingerprints, also known as hashes.
Our partners will only be able to access hashes to match against images uploaded to their platforms.
No. You’ll need to start a new case.
Yes, you can withdraw your case at any time. Go to your Case Status page and select the ‘withdraw’ button. To access Case Status, you will need the case number and PIN created during submission.
While you have the ability to withdraw, please be aware that participating companies reserve the right to continue enforcing their policies once they’ve acquired knowledge of the hash.
Resources and support
You can find services offering support here.
Eligibility Questions
You need to meet the following criteria in order to use this tool:
- The image is of you, and you have access to it.
- The image is intimate in nature (nude or sexual).
- You are over 18 years old in the image.
No, you can’t. We can only accept hashes from you if you are the person in the content. Please encourage the person in the image (if you know them) to start the process themselves.
No, you can be anywhere in the world.
If you are under 18 in the images and videos you can still get help. Please see Resources and Support.
Current participants include Facebook and Instagram. We are constantly looking to expand our list of partners.