TEST - Semaphore case study
CJBS Digital Learning
Created on August 2, 2024
Transcript
Creating a safe and secure future for the messaging app Semaphore
Semaphore case study
Press play to hear about the task that you, as a member of Semaphore's executive team, need to undertake to establish your vision for the future of Semaphore
Read the background information and then watch the video
Introduction
Over the past few years, several key players have dominated the messaging app market. However, recent market tremors have allowed fledgling ventures to gain traction. Users have become frustrated with specific incumbents, and with increasing concern over privacy and legislative changes in different countries, a possible space for a new player has been created.
Background
MOST POPULAR MESSAGING APPS BY COUNTRY
THE COMPETITORS
The messaging space is a crowded one, and you are competing directly against a large number of ventures. While they have much in common, each of these platforms differentiates itself on crucial building blocks of its business and service offering. Each of the apps also varies in popularity by region.
The differences
Click on the speech mark on each of the messaging apps below to find out more about these differences
Signal
Telegram
Many of the differences are rooted in the level of privacy afforded to users and the policies regarding the sharing of data
Regulation and politics
Press play to hear how regulations and politics are impacting messaging platforms
A changing legal landscape
The following are examples of how governments are stepping up legislation to tackle their concerns about the misuse of these messaging apps. Click on the hotspots for each of the four areas of legislation below to find out more about government intervention in this area.
The Investigatory Powers Act 2016 (UK)
EARN IT Act
GDPR
The Online Safety Act 2023
MEASURES THAT MIGHT BE REQUIRED
The changing legal landscape may require the following measures to be put in place, depending on future cases and geographical location. Click on the hotspots for each of the measures below to find out more.
1. Enhanced Moderation and Content Control
2. User Verification and Anonymity
3. Data Privacy and Protection
4. Reporting and Compliance Requirements
CHALLENGES FROM IMPLEMENTATION OF THE MEASURES
The implementation of these regulatory measures may pose the following challenges. Click on each of the challenges below to find out more.
1. Impact on User Experience
2. Legal and Financial Burden
3. Cross-Border Challenges
4. Innovation and Market Dynamics
The future of Semaphore
As a co-founder and a member of Semaphore’s executive team, you need to work with your other team members to develop a future strategy for your venture. The main questions that you need to decide on are the following:
- How do you create a sustainable business model for Semaphore? This should include a consideration of possible funding strategies and revenue models for the venture.
- What features will you focus on?
- What will Semaphore’s stance on privacy be, and how will this impact the venture and the functionality of Semaphore?
- Are there any other factors that you consider essential for the future of Semaphore and the impact the company makes?
GDPR
The GDPR is the European Union and European Economic Area regulation on data protection and privacy. It sets guidelines for the collection and processing of individuals' personal information and further addresses the transfer of personal data outside the EU and EEA. For example, WhatsApp's updated privacy policy does not apply within EU countries because it would violate the GDPR.
Enhanced Moderation and Content Control
The act may require messaging apps to implement stronger content moderation policies. This could include the use of automated tools to detect and remove illegal or harmful content, such as hate speech, cyberbullying, or child exploitation material.
Signal is a popular messaging service for those focused on the security of their communications. In the face of controversy over WhatsApp's change to its privacy policies, it saw its user base balloon. More recently, it has improved its consumer-focused features, for example by adding wallpapers and stickers. It is managed by the non-profit Signal Foundation, previously Open Whisper Systems. One curious question is where its start-up capital originated. While it later received $50 million in funding from Brian Acton, this came five years after the launch of the app, and Open Whisper Systems has never disclosed the identity of its funders. It has been alleged that at least part of its funding is linked to US national security interests; however, there is no evidence to suggest that this has influenced Signal. Signal continues to run on support from the community, making it an outlier in the tech industry. As an open-source app, developers worldwide continually contribute to improving and fixing it. Signal does not collect user metadata, call logs, or data backups. In recent years, it has added new functionality to protect the privacy of its users, for example a tool that automatically blurs faces in photos, and group links. Signal is also exploring ways around the need for phone numbers and is considering allowing individuals to use unique usernames instead. There are, however, concerns about how Signal can maintain its current model as the user base increases; the model rests on the premise that people value their privacy enough to keep supporting the service through grants and donations. Another concern that has been raised is the lack of policies and mechanisms to identify and remove bad actors at a time when these types of services are under greater scrutiny. While the app does state that the product cannot be used to violate the law, the company has so far taken a hands-off approach to moderation, in line with its general positioning.
Signal
Data Privacy and Protection
If the act includes provisions on data privacy, messaging apps may need to enhance their data protection measures. This could involve securing user data against unauthorised access and ensuring that data collection and processing comply with privacy laws.
EARN IT Act
Currently under discussion, the bill would allow safeguards to be put in place in law to combat child sexual exploitation; these may include proactive, dynamic content scanning to identify content showing abuse, as well as communication surveillance. If the act becomes law, companies may need to decide whether to accept liability for child exploitation facilitated through their platforms, undermine the privacy of end-to-end encryption by adding a backdoor for law enforcement, or remove end-to-end encryption altogether. Major social media platforms say they can adequately identify child predators without eliminating or undermining end-to-end encryption, which would affect people's safety and security and possibly run against consumer demand for privacy and security. Do you think this is the case? What is the right thing to do?
Reporting and Compliance Requirements
Messaging apps might be required to regularly report on their efforts to combat harmful content and to comply with law enforcement requests. Failure to comply could result in fines or other penalties.
Claiming to be the world’s largest standalone mobile app, WeChat offers far more functionality as a single platform for communication, social media, search, e-commerce and mobile payments. WeChat dominates the Chinese market, as other services are blocked there. Its simple interface is described as idiot-proof. User activity is closely monitored by Chinese authorities, who analyse and track accounts inside and outside of China, and censorship algorithms are used to detect and censor politically sensitive topics. WeChat is used by the Chinese government to facilitate mass surveillance, which has led other countries to raise concerns that the app poses a threat to national security.
Innovation and Market Dynamics
Stricter regulations could either stifle innovation, as companies focus more on compliance than on new features, or spur innovation in areas like AI for content moderation. They could also impact market dynamics, potentially favouring larger companies that can more easily absorb the costs of compliance.
Cross-Border Challenges
Messaging apps operating internationally might face challenges in complying with the Online Safety Act if its requirements conflict with laws in other countries. This could lead to legal disputes or the need to create region-specific versions of the app.
Impact on User Experience
These changes could affect the user experience. Enhanced moderation might make platforms safer, but it could also lead to over-censorship or slower message delivery times. Stricter verification processes could make signing up and using the app more cumbersome.
The Online Safety Act 2023
This act aims to ensure that social media companies take responsibility for, and act upon, ‘harmful’ online content. The act applies to any “user-to-user service”, imposing a global duty of care on any service with a significant number of users in the United Kingdom. The act includes a provision to weaken end-to-end encryption and has led platforms like Signal to suggest that they would leave the UK market rather than remove encryption.
The Investigatory Powers Act 2016
This authorises the government to compel communication providers to remove ‘electronic protection’ from any communication and data.
User Verification and Anonymity
The act might impose stricter user verification processes to reduce anonymity. This could help in tracking and penalising users who engage in illegal activities, but it might also raise privacy concerns among users who value anonymity.
Legal and Financial Burden
Implementing these changes can be costly and legally challenging for messaging apps, especially smaller ones. They might need to hire more staff for moderation, invest in new technologies for content detection and data security, and navigate complex legal landscapes.
This is the world’s most popular messaging application as of 2020, with over 2 billion users worldwide. It was initially launched in February 2009. WhatsApp allows users to send text and voice messages, share images, documents, user location and other content, and make voice and video calls. All a user needs to register is a mobile phone number. For a period, users had to pay to use the application, but this charge was dropped in January 2016. Facebook acquired WhatsApp in February 2014 for $19.3 billion. In January 2021, WhatsApp announced that it would change its privacy policy to allow an increase in the data shared with its parent company, Facebook. This led to the loss of millions of users. While WhatsApp cannot share the contents of messages sent within the app, a wide range of metadata is currently shared for use by Facebook companies. This includes your phone number and any other information you provide at registration; information about your phone, such as make, model and mobile carrier; your IP address; and any financial transactions made over WhatsApp. Further, under its privacy policy, it may share contacts, status updates, when people use WhatsApp, how long they use it, and unique identifying numbers for users’ phones. Recent updates to the app now allow users to sign in on multiple devices (smartphones and desktops).
Released in 2013, Telegram has been particularly popular in countries with extensive surveillance and censorship, like Russia and Iran, which have made several attempts to ban it. It also saw a boost in its user numbers when WhatsApp was acquired by Facebook in 2014. However, there are several concerns that it is not as secure as portrayed, with leading security experts claiming it has a wide range of issues. Unlike other services (for example, WhatsApp and Signal, which both use the same encryption protocol), Telegram uses its own encryption protocol, which it is not transparent about, leaving experts to note that it is not possible to know whether it is secure or insecure. Telegram further differs from other services in not having end-to-end encryption by default; users need to opt in, which has likely led some individuals to think their communication is encrypted when it is not. Telegram also has a few features not found in other services, such as the secret chat function, which includes a self-destruct timer. In 2022, Telegram launched a premium service that removes ads and doubles the limits on file uploads and the number of groups you can join.