I am trying to learn BFS and analyse its time complexity on my own, but I could not wrap my head around it. Below is my implementation of a Graph data structure and the BFS algorithm. Is the time complexity really O(Vertices + Edges) in my implementation? If not, what am I missing in terms of time complexity and efficiency?
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

public class Graph {
    private final Map<Vertex, List<Vertex>> nodes;

    public Graph() {
        this.nodes = new HashMap<>();
    }

    public void addEdge(Vertex source, Vertex target) {
        List<Vertex> neighbours = getEdges(source);
        if (!neighbours.contains(target))
            neighbours.add(target);
        this.nodes.put(source, neighbours);
    }

    public List<Vertex> getEdges(Vertex source) {
        return this.nodes.getOrDefault(source, new LinkedList<>());
    }

    // Accessor used by doBFS() below.
    public Map<Vertex, List<Vertex>> getNodes() {
        return this.nodes;
    }
}
import java.util.HashSet;
import java.util.LinkedList;
import java.util.Queue;
import java.util.Set;

public class BreadthFirstSearch {
    private final Graph graph;
    private final Queue<Vertex> queue;
    private final Set<Vertex> visited;

    public BreadthFirstSearch(Graph graph) {
        this.graph = graph;
        this.queue = new LinkedList<>();
        this.visited = new HashSet<>();
    }

    public void doBFS() {
        this.graph.getNodes().forEach((key, value) -> {
            this.queue.add(key);
            this.queue.addAll(value);
            while (!this.queue.isEmpty()) {
                Vertex vertex = this.queue.poll();
                if (this.visited.contains(vertex)) {
                    continue;
                }
                System.out.print(vertex + " -> ");
                visited.add(vertex);
            }
        });
    }
}
2 Answers
You have not implemented a BFS search, because you do not iteratively add vertices to the queue as they are reached. You only ever add each vertex and its immediately adjacent neighbours on each pass of the outer loop, and that is not a BFS.
In Java:
- HashMap.forEach has a time complexity of O(buckets + n); for a fixed number of hash buckets, this is effectively O(n) (the small sketch after this list illustrates the bucket cost).
- Collection.add has a time complexity of O(1).
- Collection.addAll has a time complexity of O(n).
- LinkedList.poll has a time complexity of O(1).
- Collection.isEmpty has a time complexity of O(1).
- HashSet.contains has a worst-case time complexity of O(log(n)) (or O(n) if you are using Java 7 or below). Assuming that your objects hash with an even distribution and the number of items is not orders of magnitude greater than the number of buckets used by the HashSet, the average time complexity can be considered to be O(1).
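As a side note on the forEach point, here is a tiny, illustrative sketch of my own (not a rigorous benchmark, and the ForEachCost class name is just for illustration): a HashMap created with a very large capacity still makes forEach walk every bucket, even when it holds almost nothing.

import java.util.HashMap;
import java.util.Map;

public class ForEachCost {
    public static void main(String[] args) {
        // A map with a huge bucket array but only two entries: forEach still
        // walks every bucket, which is why its cost is O(buckets + n), not O(n).
        Map<Integer, String> sparse = new HashMap<>(1 << 20);
        sparse.put(1, "a");
        sparse.put(2, "b");

        long start = System.nanoTime();
        sparse.forEach((k, v) -> { /* no-op */ });
        System.out.printf("forEach over 2 entries in a ~2^20-bucket map took %d ns%n",
                System.nanoTime() - start);
    }
}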
So each line of your code has the time complexity:
this.graph.getNodes().forEach((key, value) -> {      // O(|V|)
    this.queue.add(key);                              // O(1)
    this.queue.addAll(value);                         // O(|vertex.adjacent|)
    while (!this.queue.isEmpty()) {                   // O(1)
        Vertex vertex = this.queue.poll();            // O(1)
        if (this.visited.contains(vertex)) {          // O(log(|V|)) worst-case
            continue;                                 // O(1)
        }                                             //
        System.out.print(vertex + " -> ");            // O(1)
        visited.add(vertex);                          // O(1)
    }                                                 //
});                                                   //
In this case, using addAll is not actually a problem because, for each vertex, you are adding each adjacent vertex, which is equivalent to iterating over the edges in a single direction. Aggregated over the entirety of the outer loop, you add each edge's adjacent vertex in each direction, so this is O(2E), or simply O(E).
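As a quick sanity check of that aggregation argument, here is a small hypothetical helper of my own (the EdgeCount class is not part of the original code; it assumes the getNodes() accessor used in doBFS above). Summing the adjacency-list sizes counts every stored edge once, which is exactly the total work done by all the addAll calls.

import java.util.List;
import java.util.Map;

public class EdgeCount {
    public static int countStoredEdges(Graph graph) {
        int total = 0;
        // Each adjacency-list entry corresponds to one stored edge, so the
        // aggregated cost of all the addAll calls over the outer loop is O(E).
        for (Map.Entry<Vertex, List<Vertex>> entry : graph.getNodes().entrySet()) {
            total += entry.getValue().size();
        }
        return total; // equals the number of stored edges in this representation
    }
}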
Where you do have an issue is the visited check: HashSet.contains has O(log(n)) worst-case time complexity, and that check is performed for every element that passes through the queue. Since each adjacent vertex of each edge is enqueued, over the entire process this is O(E.log(V)) worst-case time complexity (and O(E) average time complexity, assuming the previously mentioned preconditions hold).
Your entire algorithm is O(V + E.log(V)) worst-case time complexity (or O(V + E.V) worst-case time complexity if you are using Java 7 or earlier) and O(V + E) best-case/average time complexity (again, assuming the previous preconditions are met).
To fix the worst-case time complexity issues:
- You could add a visited flag to the Vertex object and set the flag on the object or, if your vertices each have a unique sequential numerical index (i.e. they are numbered 0 .. |V|-1), you could have an array of boolean flags on the BreadthFirstSearch object corresponding to each vertex and update the flag in the array. Either way you can update the flags in O(1) time.
- You can stop using a HashMap to store the adjacency lists and, instead, store each list as an attribute on its vertex.
Something like:
import java.util.LinkedList;
import java.util.Queue;

public class BreadthFirstSearch {
    private final Graph graph;
    private final Queue<Vertex> queue;
    private final boolean[] visited;

    public BreadthFirstSearch(Graph graph) {
        this.graph = graph;
        this.queue = new LinkedList<>();
        // One flag per vertex, indexed by the vertex's sequential index.
        this.visited = new boolean[graph.getNodes().size()];
    }

    private void enqueue(final Vertex vertex) {
        final int index = vertex.index;
        if (!this.visited[index]) {
            // Mark the vertex as soon as it is enqueued so it can never be queued twice.
            this.visited[index] = true;
            this.queue.add(vertex);
        }
    }

    public void doBFS() {
        // The outer loop only exists to reach vertices in other components;
        // the inner while loop is the actual BFS.
        this.graph.getNodes().forEach((key, value) -> {
            this.enqueue(key);
            while (!this.queue.isEmpty()) {
                final Vertex vertex = this.queue.poll();
                System.out.print(vertex + " -> ");
                for (final Vertex adjacent : vertex.getAdjacent()) {
                    this.enqueue(adjacent);
                }
            }
        });
    }
}
This assumes that you can write a vertex.getAdjacent method that has O(1) worst-case time complexity. With your current implementation, which uses a HashMap to store relationships, this is unlikely, as HashMap.get has O(log(n)) worst-case time complexity. Instead, you may need to look at storing the adjacency list as an attribute on each Vertex instance rather than in a HashMap on the graph.
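For completeness, here is one possible shape for such a Vertex class. This is only a sketch under the assumptions above (a sequential index field and the adjacency list stored on the vertex itself); it is not part of either original implementation.

import java.util.ArrayList;
import java.util.List;

// A Vertex that carries its own adjacency list, so getAdjacent() is O(1)
// and visited flags can be stored in an array indexed by `index`.
public class Vertex {
    public final int index;          // sequential index 0 .. |V|-1
    private final String label;
    private final List<Vertex> adjacent = new ArrayList<>();

    public Vertex(int index, String label) {
        this.index = index;
        this.label = label;
    }

    public void addAdjacent(Vertex target) {
        this.adjacent.add(target);
    }

    public List<Vertex> getAdjacent() {
        return this.adjacent;
    }

    @Override
    public String toString() {
        return this.label;
    }
}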
The time complexity of BFS is O(V + E). However, there are a few parts of your code that I don't understand. In your BreadthFirstSearch#doBFS, you add each node and all the nodes it can reach, for every node. You shouldn't be looping over each node in BFS; you just need to add the first node to the queue and, while the queue isn't empty, process the node at the front.
public void doBFS(Vertex start) {
    queue.add(start);
    visited.add(start);
    while (!queue.isEmpty()) {
        Vertex current = queue.poll();
        System.out.println("vertex " + current + " visited");
        for (Vertex target : graph.getNodes().getOrDefault(current, List.of())) {
            if (!this.visited.contains(target)) {
                // Mark on enqueue so each vertex enters the queue at most once.
                this.visited.add(target);
                queue.add(target);
            }
        }
    }
}
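For illustration only, here is one hypothetical way to drive this single-source version; the Vertex(int, String) constructor and toString used here are assumptions matching the sketch above, not part of the original post.

// Build a tiny directed graph: A -> B, A -> C, B -> C.
Graph graph = new Graph();
Vertex a = new Vertex(0, "A");
Vertex b = new Vertex(1, "B");
Vertex c = new Vertex(2, "C");
graph.addEdge(a, b);
graph.addEdge(a, c);
graph.addEdge(b, c);

BreadthFirstSearch bfs = new BreadthFirstSearch(graph);
bfs.doBFS(a); // prints each of A, B and C exactly once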
Comments on the question:

"this.graph = graph" You are storing two copies of your graph? That is going to be horribly memory inefficient. – ravenspoint, Feb 3 at 0:47

this.graph = graph will not create two copies of the graph. It simply creates an additional reference that points to the same underlying object. I agree with you on the memory efficiency, but my primary purpose here is to understand how the time complexity of BFS is O(V + E). – Bala Ji, Feb 3 at 1:02