The Master of Science in Computer Engineering requires a minimum of 30 credit hours (10 courses).
Coursework varies depending on whether the student chooses the thesis or non-thesis degree track.
Updated 1/17/2024
Coursework                               Thesis Option    Non-Thesis Option
Computer Engineering Required Courses    9 credits        9 credits
Computer Engineering Area Courses        9 credits        9 credits
Computer Engineering Electives           3 credits        12 credits
Independent Study                        3 credits        NA*
Thesis                                   6 credits        NA
Total Credit Hours                       30 credits       30 credits
Note: Students must complete ECE 9030 before applying for the thesis option. Thesis applications also require a written research proposal and the recommendation and approval of the student's research advisor and the department chairperson. Students who qualify for the thesis option are required to make an oral presentation prior to graduation.
* Graduate students electing the non-thesis option may substitute three credits of independent study for one elective course.
Degree Plan
Graduate students must submit a degree plan by the midterm of their first semester. Five-year students must submit a plan with their 5-year program application. The plan must be approved by an MS CPE academic advisor. Changes to the degree plan can be made by submitting an updated form approved by the advisor.
Computer components, subsystems, and their interaction. Instruction sets, central processing units, microprogramming, intersystem communications, interrupts, DMA, memory hierarchy, and operating system demands on hardware. Prerequisite: Undergraduate background in digital systems (equivalent to ECE 2042).
Credit Hours:
3
Last Offered:
Spring 2025, Spring 2023, Spring 2022, Spring 2021
Programming using the UNIX operating system, shells, utilities, and C. Emphasis on standards including the ISO/IEC C standard and the POSIX/IEEE Open Group Single Unix Specification.
Topics include: concepts in nomadic computing and mobility; challenges in design and deployment of wireless and ad hoc networks; MAC issues, routing protocols and mobility management for ad hoc networks and networks of the future. Prerequisites: ECE 4470 or equivalent.
Hardware security topics including embedded systems security, hardware Trojans, security in implantable medical devices, security in RFID/NFC, protection from side-channel attacks, tamper resistance and crypto processor design, trusted FPGA design/JTAG, and hardware-based cryptanalysis.
Hardware system design and modeling including synchronous design techniques, modeling of systems and subsystems at various levels of detail, and the use of hardware description languages. Presentation of a schematic capture tool, VHDL modeling and simulation, and SPICE simulation.
A hands-on course on software and architecture aspects of embedded systems. Topics include: embedded processor architecture, software architecture and development, communicating with I/O devices, firmware and operating systems, buses and embedded networks, memory technology and design, and low power design.
Credit Hours:
3
Last Offered:
Spring 2025, Spring 2024, Spring 2023, Spring 2022
This course examines common low-level software vulnerabilities that take advantage of current system architectures. Mitigation strategies at the software level and the system level will be discussed and analyzed.
Introduces students to advanced digital design and implementation using FPGAs (Field-Programmable Gate Arrays). Topics include VHDL and Verilog, FPGA architectures, programming technologies, design methodologies, simulation and synthesis, place and route, and timing analysis. FPGA boards and EDA tools are used to help students gain hands-on experience. Prerequisite: Digital/logic design and VHDL basics.
Basic foundations of post-quantum cryptographic engineering and recent advances in the field; introduces design and implementation techniques for the arithmetic unit and overall post-quantum cryptography on both hardware and software platforms, as well as side-channel attack techniques.
Advanced Machine Learning covers three main areas: basic algorithmic foundations such as linear regression and neural networks, applications of machine learning in image classification and natural language processing, and hardware acceleration of machine learning using GPUs and customized silicon (e.g., TPU).
Any course from the area courses above may also count as an elective. At least two of the electives must be ECE courses. Courses not listed here may count as electives with approval of the advisor.
Theory and practice of computer communications security, including cryptography, authentication, and secure electronic mail. Topics include secret and public key cryptography; message digests; password-based, address-based, and cryptographic authentication; privacy and authentication in email; PEM, PGP, and S/MIME. Use of various algorithms.
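As a small illustration of the message digest and cryptographic authentication topics listed above, here is a minimal Python sketch (standard library only; the message and key values are invented for the example) that computes a SHA-256 digest and an HMAC authentication tag:

```python
import hashlib
import hmac

# Hypothetical message and shared secret, for illustration only.
message = b"meet at noon"
secret_key = b"shared-secret"

# Message digest: a fixed-length fingerprint of the message contents.
digest = hashlib.sha256(message).hexdigest()

# HMAC: a keyed digest that also authenticates the sender, since only
# holders of the shared secret can produce a matching tag.
tag = hmac.new(secret_key, message, hashlib.sha256).hexdigest()

print("SHA-256 digest:", digest)
print("HMAC-SHA-256 tag:", tag)

# The receiver recomputes the tag and compares it in constant time.
expected = hmac.new(secret_key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```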
Malware and cyber threats: computer network defense; software for Data Protection and Privacy, Security Information and Event Management (SIEM), Governance, Risk and Compliance (GRC); trusted computer systems and secure applications; identity and access management including biometrics; next generation security concepts.
Credit Hours:
3
Last Offered:
Spring 2025, Spring 2024, Spring 2023, Spring 2022
Security risks of critical infrastructure systems such as electrical, pipelines, water, and transportation. Design and setup of Supervisory Control and Data Acquisition (SCADA) systems, Distributed Control Systems (DCS), and Programmable Logic Controller (PLC) systems. Security challenges and defense-in-depth methodology. Hands-on lab experiments.
Quantifying security in an unambiguous way using the Trusted System Evaluation Criteria. "Hacking" a system, developing and implementing countermeasures and threat removal, and techniques for access control, confidentiality, etc. Securing the network, the web, the enterprise and database, the Cloud, and the Semantic Web.
Provides a technical analysis of distributed ledger technology (DLT) and application areas. Learn the process of mining and signing blocks using Proof of Work and Proof of Stake. Analyze problems best suited for public and permissioned blockchains for distributed applications.
Credit Hours:
3
Last Offered:
Spring 2025, Spring 2024, Spring 2022, Spring 2021
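For readers unfamiliar with Proof of Work as mentioned in the distributed ledger course above, the following minimal Python sketch (the block contents and difficulty target are invented for illustration) shows the core mining loop: searching for a nonce whose hash meets a difficulty target.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Search for a nonce so that SHA-256(block_data + nonce) starts with
    `difficulty` leading hex zeros -- the essence of Proof of Work."""
    target_prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest
        nonce += 1

# Hypothetical block contents, for illustration only.
nonce, digest = mine("prev_hash=abc123;tx=alice->bob:5")
print(f"nonce={nonce} hash={digest}")
```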
Security requirements and design principles for secure software development. Security issues in current applications, database systems and web systems. Identifying vulnerabilities, their impact, and solutions to securing them.
Prerequisites:
ECE 8484
Credit Hours:
3
Last Offered:
Spring 2024, Spring 2023, Spring 2022, Spring 2021
US and relevant international health and data security and privacy laws/regulations, HIPAA and HITECH compliance for EHR software and medical devices, federal and state patient privacy and health data access rights, electronic transmission of health data, health insurance, FDA rules and regulations, unauthorized access, vulnerabilities, unsecured wireless access, inadequate encryption, authentication failures, and other access control vulnerabilities, security risk assessment, privacy and security gaps in health information exchanges, federal and state privacy breach notification laws and related civil and criminal penalties, and successful security compliance audit and management strategies.
Fundamental strategies for algorithm design; mathematical and empirical techniques for analysis of nonrecursive and recursive algorithms, with applications such as sorting, searching, string processing and graphs; NP-complete problems and approximation algorithms.
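As a concrete instance of the recursive analysis described above, the sketch below (standard merge sort, written in Python purely for illustration) implements an algorithm whose running time satisfies the recurrence T(n) = 2T(n/2) + O(n) and therefore runs in O(n log n) time.

```python
def merge_sort(a: list[int]) -> list[int]:
    """Merge sort: T(n) = 2T(n/2) + O(n)  =>  O(n log n)."""
    if len(a) <= 1:               # base case: T(1) = O(1)
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # T(n/2)
    right = merge_sort(a[mid:])   # T(n/2)

    # Merge step: O(n) comparisons and copies.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```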
Organization, characteristics, constructs, and design principles of programming languages; syntax, semantics, and pragmatics; language implementation issues; different programming paradigms such as imperative, functional, object-oriented, and logic programming.
Modern database systems, including relational and NoSQL systems. Emphasizes practical knowledge while covering the essential theory: design, query languages, security, and transactions. Focuses on both theory and practice.
Automata theory: deterministic and non-deterministic finite automata, pushdown automata, regular languages, context-free grammars, pumping lemma. Computability and recursion theory: Turing machines and their variations, decidability and recursive enumerability, mapping reducibility and Turing reducibility, undecidability of the halting problem, logical theories and Gödel's incompleteness theorem. Complexity theory: time complexity, space complexity, major open problems on computational complexity. Corequisite: CSC 8301 or degree program in mathematics.
Credit Hours:
3
Last Offered:
Spring 2025, Spring 2024, Spring 2023, Spring 2022
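To make the finite-automaton material above concrete, here is a minimal Python sketch of a deterministic finite automaton (the states, alphabet, and language are invented for the example) that accepts binary strings containing an even number of 1s:

```python
# DFA over {0, 1} accepting strings with an even number of 1s.
# States: "even" (start, accepting) and "odd".
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
START, ACCEPTING = "even", {"even"}

def accepts(word: str) -> bool:
    """Run the DFA on `word` and report whether it ends in an accepting state."""
    state = START
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING

assert accepts("1100")       # two 1s: accepted
assert not accepts("10110")  # three 1s: rejected
```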
Study of algorithms and systems that can learn without being explicitly programmed. Topics include: clustering, classification, prediction, supervised learning, unsupervised learning, decision trees, support vector machines, random forests, regression, dimensionality reduction, neural networks, deep learning, and probabilistic graphical models.
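A minimal illustration of the supervised learning and regression topics above: the Python sketch below (the synthetic data and NumPy usage are illustrative choices, not part of any particular course) fits an ordinary least-squares line via the normal equations.

```python
import numpy as np

# Synthetic data (hypothetical): y is roughly 2*x + 1 plus Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.shape)

# Design matrix with a bias column; solve the least-squares problem
# min_w ||X w - y||^2, equivalent to the normal equations (X^T X) w = X^T y.
X = np.column_stack([x, np.ones_like(x)])
w, *_ = np.linalg.lstsq(X, y, rcond=None)

slope, intercept = w
print(f"learned slope ~= {slope:.2f}, intercept ~= {intercept:.2f}")
```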
An introduction to software engineering covering development life cycle models, requirements analysis and specification, design concepts and methods, testing, maintenance, CASE tools, and management concerns. Additional topics may include reuse metrics, experimentation, reengineering, development environments, and standards. The student may be required to write a research paper and/or give an in-class presentation.
The advancements in embedded processors and sensor networks that have made the IoT feasible. Topics include: introduction, domains of application, IoT vs. M2M, IoT management, protocols, design methodologies, hands-on design using Raspberry Pi and Python, review of servers and clouds, and data analytics.
Divisibility; Euclidean algorithm; prime numbers; Fundamental Theorem of Arithmetic; congruences; arithmetic functions; Diophantine equations. Additional topics, which may vary by semester, include cryptography, the law of quadratic reciprocity, and continued fractions.
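As a concrete example of the Euclidean algorithm listed above, a short Python sketch (the specific numbers are arbitrary):

```python
def gcd(a: int, b: int) -> int:
    """Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

# Example: 252 = 2*105 + 42, 105 = 2*42 + 21, 42 = 2*21 + 0, so gcd = 21.
print(gcd(252, 105))  # 21
```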
Model construction, Markov chains, game theory, networks and flows, growth processes and models for epidemics and queues with an emphasis on model construction.
Credit Hours:
3
Last Offered:
Summer 2023, Summer 2021, Spring 2020, Summer 2018
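To illustrate the Markov chain modeling mentioned above, here is a minimal Python sketch (the two-state weather chain and its transition probabilities are invented for the example) that estimates the chain's stationary distribution by simulation:

```python
import random

# Hypothetical two-state weather chain.
# P(sunny -> sunny) = 0.8, P(rainy -> sunny) = 0.4 (values are made up).
P_TO_SUNNY = {"sunny": 0.8, "rainy": 0.4}

def estimate_sunny_fraction(steps: int = 100_000, seed: int = 42) -> float:
    """Simulate the chain and return the fraction of time spent in 'sunny',
    which estimates the stationary probability of that state."""
    random.seed(seed)
    state, sunny_steps = "sunny", 0
    for _ in range(steps):
        state = "sunny" if random.random() < P_TO_SUNNY[state] else "rainy"
        sunny_steps += state == "sunny"
    return sunny_steps / steps

# Exact stationary probability of 'sunny' is 0.4 / (0.2 + 0.4) = 2/3.
print(estimate_sunny_fraction())
```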
An investigation of a current research topic under the direction of a faculty member. A written report is required. The Chairperson's permission is required to register for this course.
Independent student investigation of an electrical engineering or computer engineering problem under the supervision of a faculty advisor; a comprehensive written report embodying the results of the project is required. Prerequisite: Consent of the Chairperson.