Easy access to the internet has led to a steep increase in the number of comments written by non-expert writers. While such user comments may provide useful information about the topic under discussion, readers are left with the burden of sifting through piles of uninformative text to find the information they need. One popular approach to this problem is to build systems that aid readers by extracting and summarizing available information or by recommending well-written comments. We, however, consider the problem from the perspective of the commenters: can we build a system that guides commenters toward writing comments with better argument structures? In this talk, I will present the core components of an automated system for assisting commenters in constructing better-formed arguments in their comments. These include: (1) a monological argumentation model that captures the evaluability of arguments in online settings, (2) a classifier that determines appropriate types of support for the propositions comprising online user comments, and (3) a classifier that identifies support relations among those propositions. I will also discuss how this system applies to broader domains such as automated essay grading and recommendation systems.