Psycholinguistic Aspects of Humanitarian Component of Cybersecurity
Abstract
Introduction. The paper focuses on the language means exploited by social engineers in their activities, viewed in terms of the humanitarian aspects of cybersecurity. The goal of this research is to analyze, from a psycholinguistic point of view, the methods and techniques employed by social engineers in their malicious activity and the features of that activity, with a view to the further development of counteraction mechanisms.
Methods. To obtain the results, we used the following methods: primary source analysis, analysis of spoken and written speech and speech products, and intent analysis.
Results. Activity theory (AT) has been successfully applied to identify the key features of social engineers’ work. On the basis of AT, we present a three-component model that applies only in the case of a social engineer’s successful attack (action).
Based on the analysis of the sources, we distinguished the types of spoken and written communication actions (corresponding to direct and indirect actions) used by social engineers to affect cognitive processes and thereby retrieve “sensitive data” and confidential information. We also categorized the psychological and language means that social engineers evidently apply in their activities. We stress that in most cases social engineers’ activities are aimed at: a) affecting a person’s emotions and feelings; b) blocking rational and critical thinking; c) manipulating moral and ethical values; and d) using positive incentives that are of interest to a user. Taking into account the abovementioned types of communication and the psychological and language means, we systematized and described the general techniques of using oral and written forms of language and technologies: 1) techniques related to the use of spoken speech; 2) techniques related to the use of written speech; 3) techniques related to the use of USB flash drives, applications, and software.
The findings are applicable to developing a mechanism for countering social engineers’ attacks and contribute to improving the level of cyber literacy.
References
Actual cyber threats – 2018. Trends and forecasts (2018). Positive Technologies. Retrieved from: https://www.ptsecurity.com/upload/corporate/ru-ru/analytics/Cybersecurity-threatscape-2018-rus.pdf [in Russian].
Binks, A. (2019). The art of phishing: past, present and future. Computer Fraud & Security, 4, 9–11. https://doi.org/10.1016/S1361-3723(19)30040-5
Bykov, V.Y., Burov, O.Y., & Dementievska, N.P. (2019). Cyber security in a digital learning environment. Information Technologies and Learning Tools, 70(2), 313–331. https://doi.org/10.33407/itlt.v70i2.2876
Carroll, J.M. (2003). HCI Models, Theories, and Frameworks: Toward a Multidisciplinary Science. San Francisco, CA: Morgan Kaufmann Publishers Inc.
Cialdini, R.B. (2015). Psihologiya vliyaniya [Psychology of Influence]. Saint Petersburg: Izd. Piter [in Russian].
Cole, M. (1996). Cultural Psychology: A Once and Future Discipline. MA: Cambridge University Press.
Cole, M., Engeström, Y. (1993). A cultural-historical Approach to Distributed Cognition. In Salomon, G. (Ed.), Distributed Cognitions: Psychological and Educational Considerations (pp. 1–46). New York: Cambridge University Press.
Damasio, A.R. (2001). Emotion and the Human Brain. Annals of the New York Academy of Sciences, 935, 101–106. https://doi.org/10.1111/j.1749-6632.2001.tb03475.x
Dawson, J., & Thomson, R. (2018). The Future Cybersecurity Workforce: Going Beyond Technical Skills for Successful Cyber Performance. Frontiers in Psychology, 9, 744. https://doi.org/10.3389/fpsyg.2018.00744
Dreibelbis, R.C., Martin, J., Coovert, M.D., & Dorsey, D.W. (2018). The Looming Cybersecurity Crisis and What It Means for the Practice of Industrial and Organizational Psychology. Industrial and Organizational Psychology, 11(02), 346–365. https://doi.org/10.1017/iop.2018.3
Engeström, Y. (1999). Activity theory and individual and social transformation. In Y. Engeström, R. Miettinen, & R.-L. Punamäki (Eds.), Perspectives on Activity Theory (pp. 19–38). https://doi.org/10.1017/CBO9780511812774.003
Ermakova, L., & Aidarov, Yu. (2009). Lingvistika protiv sotsialnoy inzhenerii [Linguistics Versus Social Engineering]. Otkrytye sistemy. SUBD – Open Systems. DBMS. Retrieved from: https://www.researchgate.net/publication/307855981 [in Russian].
Grachev, G.V., & Melnik, I.K. (2002). Manipulirovanie lichnostyu: organizatsiya, sposobyi i tehnologii informatsionno-psihologicheskogo vozdeystviya. [Manipulation of Personality: Organization, Methods and Technologies of Information and Psychological Impact]. Moscow: Izd. Algoritm [in Russian].
Hadlington, L. (2017). Human factors in cybersecurity; examining the link between Internet addiction, impulsivity, attitudes towards cybersecurity, and risky cybersecurity behaviours. Heliyon, 3(7). https://doi.org/10.1016/j.heliyon.2017.e00346
Hadnagy, Chr. (2018). Social Engineering: The Science of Human Hacking (2nd ed.). https://doi.org/10.1002/9781119433729
Hatfield, J.M. (2018). Social engineering in cybersecurity: The evolution of a concept. Computers & Security, 73, 102–113. https://doi.org/10.1016/j.cose.2017.10.008
Kasperski, K. (2005). Sekretnoe oruzhie sotsialnoy inzhenerii [The Secret Weapon of Social Engineering]. Kompaniya AyTi [in Russian].
King, Z.M., Henshel, D.S., Flora, L., Cains, M.G., Hoffman, B., & Sample, C. (2018). Characterizing and Measuring Maliciousness for Cybersecurity Risk Assessment. Frontiers in Psychology, 9, 39. https://doi.org/10.3389/fpsyg.2018.00039
Kuznetsov, M. (2007). Sotsialnaya inzheneriya i sotsialnyie hakeryi [Social Engineering and Social Hackers]. Saint Petersburg: «BHV-Peterburg» [in Russian].
Leontiev, A.N. (1975). Deyatelnost. Soznanie. Lichnost [Activity. Consciousness. Personality]. Moscow: «Politizdat» [in Russian].
Li, G., Shen, Yu., Zhao, P., Lu, X., Liu, J., Liu, Ya., & Hoi, S. (2019). Detecting cyberattacks in industrial control systems using online learning algorithms. Neurocomputing, 364, 338–348. https://doi.org/10.1016/j.neucom.2019.07.031
Lively, C.E., Jr. (2003). Psychological Based Social Engineering. GSEC, Option 1, version 1.4b. Retrieved from: https://www.giac.org/paper/gsec/3547/psychological-based-social-engineering/105780
Mansfield-Devine, S. (2017). Bad Behaviour: Exploiting Human Weaknesses. Computer Fraud & Security, 1, 17–20. https://doi.org/10.1016/S1361-3723(17)30008-8
Marble, J., Lawless, W., Mittu, R., Coyne, J., Abramson, M., & Sibley, C. (2015). The Human Factor in Cybersecurity: Robust & Intelligent Defense. In Cyber Warfare: Building the Scientific Foundation (pp. 173–206). Springer International Publishing. https://doi.org/10.1007/978-3-319-14039-1_9
Mitnik, K., & Saymon, V. (2004). Iskusstvo obmana [The Art of deception]. Kompaniya AyTi [in Russian].
Mouton, F., Leenen, L., & Venter, H.S. (2016). Social engineering attack examples, templates and scenarios. Computers & Security, 59, 186–209. https://doi.org/10.1016/j.cose.2016.03.004
Nicholls, J.G., Martin, A.R., Wallace, B.G., & Fuchs, P.A. (2001). From Neuron to Brain: A Cellular and Molecular Approach to the Function of the Nervous System (4th ed.). Sinauer Associates.
Nygren, T.E., Isen, A.M., Taylor, P.J., & Dulin, J. (1996). The influence of positive affect on the decision rule in risk situations: Focus on outcome (and especially avoidance of loss) rather than probability. Organizational Behavior and Human Decision Processes, 66(1), 59–72. https://doi.org/10.1006/obhd.1996.0038
Quigley, K. (2015). ‘Cyber Gurus’: A rhetorical analysis of the language of cybersecurity specialists and the implications for security policy and critical infrastructure protection. Government Information Quarterly, 32(2), 108–117. https://doi.org/10.1016/j.giq.2015.02.001
UN Documents. Elements for Creating a Global Culture of Cybersecurity. Retrieved from: http://www.un.org/ru/documents/decl_conv/conventions/elements.shtml
Vanyushicheva, O.Yu., Tulupeva, T.V., Paschenko, A.E., & Tulupev, A.L. (2011). Klassifikatsiya psihologicheskih osobennostey sostavlyayuschih osnovu uyazvimostey polzovatelya pri ugroze sotsioinzhenernyih atak [Classification of Psychological Features that Form the Basis of User Vulnerabilities in Case of Threat of Social Engineering Attacks]. Trudy SPIIRAN – SPIIRAS Proceedings, 17, 70–99 [in Russian].
Vygotskiy, L.S. (2005). Psihologiya razvitiya cheloveka [Psychology of Human Development]. Moscow: Izd-vo «Smyisl; Eksmo» [in Russian].
Watson, G., Mason, A., & Ackroyd, R. (2014). Social Engineering Penetration Testing: Executing Social Engineering Pen Tests, Assessments and Defense. Boston: Syngress. https://doi.org/10.1016/B978-0-12-420124-8.00011-9
Wertsch, J.V. (1993). Voices of the Mind: A Sociocultural Approach to Mediated Action. Harvard University Press.
Wertsch, J.V. (1994). The Primacy of Mediated Action in Sociocultural Studies. Mind, Culture, and Activity, 1(4), 202–208.
Workman, M. (2007). Gaining Access with Social Engineering: An Empirical Study of the Threat. Information Systems Security, 16(6), 315–331. https://doi.org/10.1080/10658980701788165
Yan, Zh., Robertson, T., Yan, R., Sung, Yo. P., Bordoff, S., Chen, Q., & Sprissler, E. (2018). Finding the weakest links in the weakest link: How well do undergraduate students make cybersecurity judgment? Computers in Human Behavior, 84, 375–382. https://doi.org/10.1016/j.chb.2018.02.019

This work is licensed under a Creative Commons Attribution 4.0 International License.