In mathematics and computer science, an adjacency matrix is a means of representing which vertices (or nodes) of a graph are adjacent to which other vertices. Another matrix representation for a graph is the incidence matrix.
Specifically, the adjacency matrix of a finite graph G on n vertices is the n × n matrix in which the non-diagonal entry a_ij is the number of edges from vertex i to vertex j, and the diagonal entry a_ii is, depending on the convention, either the number of edges (loops) from vertex i to itself or twice that number. Undirected graphs often use the latter convention of counting loops twice, whereas directed graphs typically use the former. Up to permuting rows and columns, there is a unique adjacency matrix for each isomorphism class of graphs, and it is not the adjacency matrix of any other isomorphism class. In the special case of a finite simple graph, the adjacency matrix is a (0,1)-matrix with zeros on its diagonal. If the graph is undirected, the adjacency matrix is symmetric.
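The definition above can be made concrete with a short sketch in Python (the helper name build_adjacency_matrix and the example graph are illustrative assumptions, not part of the original text). It builds the adjacency matrix of a small undirected multigraph from an edge list, counting each loop twice in line with the undirected convention described above.

def build_adjacency_matrix(n, edges):
    """Return the n x n adjacency matrix of an undirected multigraph.

    Each edge (i, j) with i != j adds 1 to both a[i][j] and a[j][i];
    a loop (i, i) adds 2 to a[i][i], following the common undirected
    convention of counting loops twice.
    """
    a = [[0] * n for _ in range(n)]
    for i, j in edges:
        if i == j:
            a[i][i] += 2   # loop counted twice (undirected convention)
        else:
            a[i][j] += 1
            a[j][i] += 1
    return a

# Example: 3 vertices, edges 0-1 and 1-2, plus a loop at vertex 2.
print(build_adjacency_matrix(3, [(0, 1), (1, 2), (2, 2)]))
# [[0, 1, 0], [1, 0, 1], [0, 1, 2]]

Note that the resulting matrix is symmetric, as expected for an undirected graph, and that a simple graph (no loops, no multiple edges) would yield a (0,1)-matrix with a zero diagonal.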
The relationship between a graph and the eigenvalues and eigenvectors of its adjacency matrix is studied in spectral graph theory.
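As a brief illustration of the spectral point of view (a minimal sketch; the choice of graph and the use of NumPy are our assumptions), the adjacency matrix of an undirected graph is symmetric, so its eigenvalues are real. Here they are computed for the 4-cycle C4, whose adjacency spectrum is {2, 0, 0, -2}.

import numpy as np

# Adjacency matrix of the 4-cycle C4 (vertices 0-1-2-3-0).
a = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
])

# eigvalsh is appropriate for symmetric (Hermitian) matrices.
print(np.linalg.eigvalsh(a))  # approximately [-2., 0., 0., 2.]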