George Corser, PhD


I study digital privacy. My research publications examine potential confidentiality vulnerabilities in networks and applications that transmit, receive, and store high volumes of high-precision data.

I investigate intelligent transportation systems. Privacy in cars is an important social issue and a complicated technical problem. Privacy matters because, under the law [7], people can assume a certain level of confidentiality when traveling in vehicles. Connected cars, autonomous cars, and other emerging transportation modes throw that confidentiality assumption into question.

Voluminous data, nonstandard network protocols, and nonstandard network node behavior complicate transportation system privacy protections. My research specifically explores location privacy in vehicular ad hoc network (VANET) systems, and the work generalizes to mobile ad hoc network (MANET) systems and to the Internet of Things (IoT). The volume of location data generated by IoT/VANET communications will dwarf the volumes generated by previous systems [1], and more data mean more privacy threats: as users increasingly depend on IoT/VANET systems, they become more vulnerable to both previously known and newly discovered attack vectors. VANETs also use nonstandard network protocols. Dedicated short range communications (DSRC) enable vehicle safety applications, but strip away certain features of traditional Internet Protocol (IP) networks. See Fig. 1. Finally, VANET nodes move. They communicate with each other and with central databases, gathering location data from the global positioning system (GPS) and other sources. See Fig. 2.

The term digital privacy eludes a computationally measurable definition. Early privacy advocates defined privacy as control over personal information [2]; they believed each person should possess the data about themselves. Later privacy advocates defined privacy in terms of limiting access to personal information [3]. They recognized that data about people would exist on many different computers, and they believed policies could be established under which individuals would authorize the release of personal information, even from computers they did not own.

Neither control nor access limitation may be possible in contemporary systems. Safety messages in vehicle-to-vehicle (V2V) communications, for example, contain personal information, or data which can be linked to personal information. V2V safety messages cannot be kept private or confidential; if they were, cars might crash into each other. The data are in the open. No one owns or controls them. Under such circumstances, how can we define digitally implementable privacy?

IoT/VANET researchers often define privacy in terms of mathematical anonymity [4]. Suppose there are k cars, one of which is driven by person p, but no one can be sure which of the k cars p drives. That property is called k-anonymity [5]. Person p neither controls nor limits access to data; rather, p's data are anonymized among the k drivers.
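The idea can be sketched in a few lines of code. This is a minimal illustration, not an implementation from my publications: the record fields (`region`, `speed_band`) are hypothetical stand-ins for the quasi-identifiers a released VANET data set might contain, and k is simply the size of the smallest group of records sharing the same quasi-identifier values.

```python
from collections import Counter

def anonymity_k(records, quasi_identifiers):
    """Return k for a released data set: the size of the smallest
    group of records that share the same quasi-identifier values."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical snapshot of four vehicles' reported attributes.
beacons = [
    {"region": "A", "speed_band": "high"},
    {"region": "A", "speed_band": "high"},
    {"region": "A", "speed_band": "high"},
    {"region": "B", "speed_band": "low"},
]

# The lone vehicle in region B is uniquely identifiable, so k = 1
# for this release even though three other vehicles blend together.
print(anonymity_k(beacons, ["region", "speed_band"]))
```

If the release were generalized so that all four vehicles reported identical attribute values, k would rise to 4, and an observer could not tell which record belongs to person p.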

My most important research contributions so far include definitions and metrics for digital privacy. The concept of k-anonymity works well for measuring the privacy of a single data set, but many similar data sets released in rapid succession can be combined to reduce k to its minimum, 1, i.e., no privacy at all. See Fig. 3. Privacy elimination does not occur immediately, however; it takes time. Contemporary privacy definitions must therefore consider not only anonymity but also time of anonymity, and in the case of location privacy they must account for distance, too. My recent research addresses these problems with a definition of, and metrics for, continuous network location privacy [6].
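The erosion of k over time can be sketched as a simple intersection attack. The snapshots below are invented for illustration: each set lists the vehicle pseudonyms consistent with person p's observed trajectory at one moment, and intersecting successive candidate sets shows the anonymity set shrinking toward 1.

```python
def intersect_anonymity_sets(observations):
    """Intersect a time series of candidate sets (which vehicles
    could be person p at each observation) and return the size of
    the anonymity set after each step."""
    candidates = set(observations[0])
    sizes = [len(candidates)]
    for obs in observations[1:]:
        candidates &= set(obs)  # only vehicles consistent with every observation survive
        sizes.append(len(candidates))
    return sizes

# Hypothetical sequence of safety-message snapshots.
snapshots = [
    {"v1", "v2", "v3", "v4"},
    {"v1", "v2", "v3"},
    {"v1", "v3"},
    {"v1"},
]

# k falls from 4 to 1 across the four observations.
print(intersect_anonymity_sets(snapshots))
```

How many observations (and how much elapsed time and distance) it takes for k to reach 1 is exactly what a continuous notion of location privacy must measure.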

My work also earns recognition from other researchers. My top paper has been cited 11 times in a single year, and my overall citation count has doubled in the past year. See Google Scholar.

My short-term research plans call for a collaboration with the University of Michigan Transportation Research Institute (UMTRI) and a grant from the National Science Foundation (NSF). The goal of my forthcoming research is to take my work from simulation to the real world. UMTRI collects real-world transportation data, and UMTRI representatives have already agreed to let me run my privacy algorithms on their servers. I am currently preparing an NSF grant proposal to fund this project.


  1. Marjani, M., Nasaruddin, F., Gani, A., Karim, A., Hashem, I. A. T., Siddiqa, A., & Yaqoob, I. (2017). Big IoT Data Analytics: Architecture, Opportunities, and Open Research Challenges. IEEE Access, 5, 5247-5261.
  2. Fried, C. (1984). Privacy. In F. D. Schoeman (Ed.), Philosophical dimensions of privacy: An anthology (p. 209). New York: Cambridge University Press.
  3. Moor, J. H. (1997). Towards a theory of privacy in the information age. ACM SIGCAS Computers and Society, 27(3), 27-32.
  5. Samarati, P., & Sweeney, L. (1998). Protecting privacy when disclosing information: k-anonymity and its enforcement through generalization and suppression. Technical report, SRI International.
  6. Corser, G., Fu, H., & Banihani, A. (2016). Evaluating location privacy in vehicular communications and applications. IEEE Transactions on Intelligent Transportation Systems, 17(9), 2658-2667.
  7. United States v. Jones, 132 S.Ct. 945 (2012).