Learning Combinatorial Optimization Algorithms Over Graphs


Roles

Abhi (Graph Theory)

Alvin (Reinforcement Learning / actual paper)

Pranav (actual paper)

Daniel (Conclusion: performance, advantages, disadvantages, criticism)

Intro

1) Graph Theory (MVC, TSP, MaxCut) - Common problems to solve are listed below, with small code sketches of each objective given after the list:

Minimum Vertex Cover: Given a graph G = (V, E, w), where V is the set of vertices, E is the set of edges, and w is the set of edge weights, find the smallest set of vertices such that every edge has at least one endpoint in that set.

Maximum Cut: Given a graph G = (V, E, w), find a subset of vertices S such that the total weight of the edges with exactly one endpoint in S (the edges crossing between S and the rest of the graph) is as large as possible.

Travelling Salesman Problem: Given a set of points and the distances between them, find the shortest closed tour that visits every point exactly once.
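
To make the Minimum Vertex Cover objective concrete, here is a minimal sketch (the toy graph and helper names are illustrative, not taken from the paper) that checks whether a candidate set of vertices covers every edge:

 # Toy undirected graph: edges stored as (u, v, weight) triples.
 edges = [(0, 1, 2.0), (1, 2, 1.0), (2, 3, 3.0), (3, 0, 1.5), (0, 2, 2.5)]

 def is_vertex_cover(cover, edges):
     """A set `cover` is a vertex cover if every edge has at least one endpoint in it."""
     return all(u in cover or v in cover for u, v, _ in edges)

 print(is_vertex_cover({0, 2}, edges))  # True: every edge touches vertex 0 or vertex 2
 print(is_vertex_cover({1, 3}, edges))  # False: the edge (0, 2) is left uncovered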
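
Likewise, a minimal sketch of the Maximum Cut objective on the same illustrative graph: the value of a cut is the total weight of the edges with exactly one endpoint on the chosen side.

 edges = [(0, 1, 2.0), (1, 2, 1.0), (2, 3, 3.0), (3, 0, 1.5), (0, 2, 2.5)]

 def cut_weight(s, edges):
     """Total weight of edges crossing between the vertex set `s` and its complement."""
     return sum(w for u, v, w in edges if (u in s) != (v in s))

 print(cut_weight({0, 2}, edges))  # 2.0 + 1.0 + 3.0 + 1.5 = 7.5
 print(cut_weight({0}, edges))     # 2.0 + 1.5 + 2.5 = 6.0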
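
And a minimal sketch of the Travelling Salesman objective, assuming a small symmetric distance matrix (the numbers are made up for illustration): the cost of a tour is the total length of the closed cycle it traces.

 # Illustrative symmetric distances between four points.
 dist = {
     0: {1: 2.0, 2: 2.5, 3: 1.5},
     1: {0: 2.0, 2: 1.0, 3: 2.2},
     2: {0: 2.5, 1: 1.0, 3: 3.0},
     3: {0: 1.5, 1: 2.2, 2: 3.0},
 }

 def tour_length(tour, dist):
     """Length of the closed tour that visits every point once and returns to the start."""
     return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

 print(tour_length([0, 1, 2, 3], dist))  # 2.0 + 1.0 + 3.0 + 1.5 = 7.5
 print(tour_length([0, 2, 1, 3], dist))  # 2.5 + 1.0 + 2.2 + 1.5 = 7.2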

2) Reinforcement Learning -
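
As a rough, hedged illustration of the greedy-construction idea that reinforcement learning approaches to these problems build on (a scoring function decides which vertex to add next to a partial solution), the sketch below uses a simple hand-written heuristic score as a placeholder; it is not the paper's learned structure2vec Q-network.

 # Greedy construction driven by a node-scoring function. In the paper the
 # score would come from a learned Q-function over graph embeddings; here it
 # is a hand-written placeholder so the loop is runnable on its own.
 def greedy_construct(nodes, score, is_complete):
     solution, remaining = [], set(nodes)
     while remaining and not is_complete(solution):
         best = max(remaining, key=lambda v: score(solution, v))
         solution.append(best)
         remaining.remove(best)
     return solution

 # Example: build a vertex cover by always adding the vertex that covers the
 # most still-uncovered edges (a heuristic score, not a learned one).
 edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
 uncovered = lambda sol: [(u, v) for u, v in edges if u not in sol and v not in sol]
 score = lambda sol, v: sum(v in e for e in uncovered(sol))
 is_complete = lambda sol: not uncovered(sol)

 print(greedy_construct(range(4), score, is_complete))  # [0, 2]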


Actual Paper:


Conclusions (Performance, advantages, disadvantages): A3C (Asynchronous Advantage Actor-Critic)? S2V (structure2vec)?


Criticism: