Thanks, Prof. Breuer, for the reply. I am not concerned about semantics.
What I mean is as follows:
1. When weights taken from a certain algebraic structure are assigned to the rules of a formal grammar, the weighted grammar represents a higher level of grammar. For example, if the weights of a context-free grammar are chosen from a particular type of lattice, it can represent L = {a^n b^n c^n}, although that is not a context-free language.
2. Similarly, in fuzzy automata weights are assigned from an l-monoid, a complete residuated lattice, or a Gödel structure, whereas in weighted automata weights are taken from a ring, which is a more generic structure. Is there some specific reason/benefit to using different types of algebraic structures in different situations in fuzzy automata?
By assigning weights or similar features to rules, one gains a means of control over derivations. In a standard grammar, every terminating derivation generates a word; with weights one can filter out certain types of derivations.
Take, for example, the language a^n b^n c^n that you mention. After an initial rule S -> AC we have the rules
A -> aAb with weight -3
C -> Cc with weight +3
A -> ab with weight +1
C -> c with weight +1
where the weights are integers. If we accept only those derivations in which the weights of all applied rules sum to 2, we obtain the language a^n b^n c^n. If your weights come from structures that can do even more complicated things than adding and subtracting, you can also obtain more complicated languages.
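The example above can be checked mechanically. Here is a small Python sketch: the rules and weights are exactly those listed, but the function names and the derivation-depth bound are my own illustrative choices. It enumerates terminating derivations and keeps only those whose applied rules sum to the target weight 2:

```python
# Weighted-grammar sketch for a^n b^n c^n (rules and weights as above;
# helper names and the depth bound are illustrative).

# Rules: (left-hand side, right-hand side, weight)
RULES = [
    ("A", "aAb", -3),
    ("C", "Cc",  +3),
    ("A", "ab",  +1),
    ("C", "c",   +1),
]
TARGET_WEIGHT = 2  # acceptance condition: total weight of applied rules

def derivations(sentential="AC", weight=0, depth=8):
    """Yield (word, total_weight) for every terminating derivation
    of at most `depth` rule applications, starting from S -> AC."""
    if not any(ch.isupper() for ch in sentential):  # no nonterminals left
        yield sentential, weight
        return
    if depth == 0:
        return
    for lhs, rhs, w in RULES:
        if lhs in sentential:
            yield from derivations(sentential.replace(lhs, rhs, 1),
                                   weight + w, depth - 1)

# Keep only derivations that satisfy the acceptance condition.
language = sorted({word for word, w in derivations() if w == TARGET_WEIGHT})
print(language)  # only words of the form a^n b^n c^n survive the filter
```

Every surviving word has equally many a's, b's, and c's: each application of A -> aAb (weight -3) must be cancelled by one application of C -> Cc (weight +3) for the total to reach 2.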
Does this example answer your question? Weights are more common for automata than for grammars. But in principle they should be able to do about the same things in one mechanism and in the other.
I agree with Prof. Peter Leupold. By adding weights, we are able to obtain more complicated languages. Does a negative number mean something specific, or is it just used to control the derived language? One more thing: why do we study different types of algebraic structures, such as the integral lattice monoid or the complete residuated lattice, in formal language theory? Is it just to obtain more complicated languages, or are there benefits of one structure over another? Thanks, Prof. Peter Leupold, Peter T Breuer and Thomas Korimort.
Hi Peter. If the weights were smart enough, one could compute anything, yes. For example, with Turing machine configurations as weights and the operation of checking whether one is a predecessor of the other (I think). But this would not even be associative.
So usually one will be interested in simple grammars (regular or context-free) and restricted weight structures, at least semigroups. The exact acceptance conditions are then something one can play with...
I just used the negative numbers for convenience, because it was the example that came to my head first. In this way one application of the C rule can "annihilate" one application of the A rule. Other weight structures will work, too, maybe with different acceptance conditions.
I am not completely sure why people use the structures you mention. Maybe because of what I said earlier: the more restricted the structure you use, the stronger your result. So obtaining the same language with just some semigroup is easier than with, e.g., a complete residuated lattice. Maybe the order in the lattice is used in a way like "rules can only be used with increasing weight" - but I do not really know.
Actually, I am not really using any framework; I have just made this up with a little experience from regulated rewriting and weighted automata. Most of the time, I think, weights are not used to generate or accept languages; rather they compute power series, that is, (sets of) pairs [word, weight]. So there is no need for an acceptance condition. The most common language looked at in this context is the support, i.e. the language of all words that have non-zero weight.
Weighted automata most commonly work over a semiring. There you multiply weights along the computation, then you sum the weights of all possible computations of a given word to obtain the word's weight.
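As a concrete illustration of that multiply-then-sum semantics, here is a toy two-state automaton over the ordinary (+, *) semiring of integers; the automaton itself and all names are made up for illustration, not taken from the thread:

```python
# Weighted automaton over the (+, *) semiring of integers (illustrative).
# The weight of a word is the sum, over all runs on that word, of the
# product of the transition weights along the run.

# states 0 and 1; TRANSITIONS[state][symbol] -> list of (next_state, weight)
TRANSITIONS = {
    0: {"a": [(0, 2), (1, 1)], "b": [(0, 1)]},
    1: {"a": [],               "b": [(1, 3)]},
}
INITIAL = {0: 1}  # initial weight vector
FINAL = {1: 1}    # final weights (state 1 is "accepting" with weight 1)

def word_weight(word):
    """Forward computation: multiply along runs, sum across runs."""
    vec = dict(INITIAL)
    for sym in word:
        nxt = {}
        for q, w in vec.items():
            for r, tw in TRANSITIONS[q].get(sym, []):
                nxt[r] = nxt.get(r, 0) + w * tw  # * along the run, + across runs
        vec = nxt
    return sum(w * FINAL.get(q, 0) for q, w in vec.items())

print(word_weight("ab"))  # -> 3: the single run 0 -a-> 1 -b-> 1 has weight 1 * 3
```

The support of this power series, i.e. the set of words with non-zero weight, is then the language one would extract from the automaton in the sense mentioned above.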