Some strategies for moderating online content have targeted the individuals believed to be most influential in its diffusion, while others have focused on censoring the content itself. Few approaches consider these two aspects simultaneously. The present study addresses this gap by showing how socio-semantic network analysis can identify individuals and subgroups who are strategically positioned in radical networks and whose comments encourage the use of violence. In particular, it identifies those who act as intermediaries and whose statements are often among the most violent.
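As a rough illustration of this combined structural and semantic logic, and not the study's actual pipeline, the minimal sketch below scores hypothetical users on both their brokerage position in an interaction network (betweenness centrality) and a toy lexicon-based measure of violent language. All users, comments, terms, and thresholds are invented for the example.

```python
import networkx as nx

# Hypothetical reply network: an edge means one user responded to another.
edges = [("u1", "u2"), ("u2", "u3"), ("u3", "u4"), ("u2", "u5"), ("u5", "u6")]
G = nx.Graph(edges)

# Hypothetical comments attributed to each user (invented for illustration).
comments = {
    "u1": ["this policy is wrong"],
    "u2": ["they should all be wiped out", "no mercy for traitors"],
    "u3": ["I simply disagree"],
    "u4": ["interesting thread"],
    "u5": ["time to take up arms", "burn it all down"],
    "u6": ["has anyone read the report?"],
}

# Toy lexicon of violent expressions (an assumption, not a validated dictionary).
VIOLENT_TERMS = {"wiped out", "no mercy", "take up arms", "burn it all down"}

def violence_score(texts):
    """Share of a user's comments containing at least one violent expression."""
    if not texts:
        return 0.0
    hits = sum(any(term in t.lower() for term in VIOLENT_TERMS) for t in texts)
    return hits / len(texts)

# Structural dimension: who bridges otherwise weakly connected parts of the network.
betweenness = nx.betweenness_centrality(G)

# Semantic dimension: whose comments most often encourage violence.
scores = {user: violence_score(comments.get(user, [])) for user in G}

# Flag users who are both strategically positioned and verbally violent
# (thresholds are arbitrary placeholders).
flagged = [u for u in G if betweenness[u] >= 0.3 and scores[u] >= 0.5]
print(sorted(flagged))  # here: ['u2', 'u5'], the intermediaries with violent comments
```

In this toy network the flagged users are precisely those who sit between otherwise separate parts of the graph and whose comments contain violent language, which is the kind of dual signal, structural position plus discourse content, that a socio-semantic approach is meant to capture.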