Probabilistic Finite State Automata - Part I

Probabilistic finite state automata - Part I. IEEE Transactions on Pattern Analysis and Machine Intelligence. E. Vidal, F. Thollard, C. De La Higuera, F. Casacuberta.

    • ...This approach aims at avoiding the problem by imposing a different bias: the data come from a distribution, itself represented by a stochastic automaton [16]...

    Frédéric Tantini et al. Identification in the Limit of Systematic-Noisy Languages

    • ...Alternatively one can define distances between distributions; a survey can be found in [19]...

    Colin De La Higuera. Ten Open Problems in Grammatical Inference
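The snippet above mentions defining distances between distributions. As a hedged illustration (the function names and the toy distributions are my own, not taken from the cited survey), two common choices can be computed directly on finite supports:

```python
import math

def l2_distance(p, q):
    """Euclidean (L2) distance between two distributions given as dicts."""
    support = set(p) | set(q)
    return math.sqrt(sum((p.get(x, 0.0) - q.get(x, 0.0)) ** 2 for x in support))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), in bits.
    Assumes q(x) > 0 wherever p(x) > 0."""
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

# Toy example distributions (illustrative only).
p = {"a": 0.5, "b": 0.5}
q = {"a": 0.25, "b": 0.75}
# l2_distance(p, q)  ≈ 0.3536
# kl_divergence(p, q) ≈ 0.2075
```

Note the asymmetry of the KL divergence (it is not a metric), which is one reason surveys discuss several distances side by side.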

    • ... As we will prove in the second part of our paper [40],...
    • ...Part II [40] of the paper will be devoted to the comparison with other types of models, learning issues, and the presentation of some of the extensions of the probabilistic automata...
    • ...The cases that do not fit in this definition will be analyzed in the second part of our paper [40]...
    • ...All this will be done in Part II of this paper [40]...

    Enrique Vidal et al. Probabilistic Finite-State Machines - Part I

    • ...In the first part [1] of this survey, we introduced probabilistic finite-state automata (PFA), their deterministic counterparts (DPFA), and the properties of the distributions these objects can generate...
    • ...Among the models proposed so far, some are based on acyclic automata [1], [16], [17], [18], [19]...
    • ...This distribution (which is similar to that used in Part I [1] to prove that the mean of two deterministic distributions may not be deterministic) is exactly generated by the PFA shown in Fig. 3 (left)...
    • ...Maximizing the likelihood is equivalent to minimizing the empirical cross entropy X̂(S, D_A) (see Section 6 of [1])...
    • ...The parameter updating is based on the forward and backward dynamic programming recurrences to compute the probability of a string discussed in Section 3 of [1]...
    • ...Using the optimal path (Viterbi) approximation rather than the true (forward) probability (see [1], Section 3.2, and Section 3.1, respectively) in the function to be optimized (8), a simpler algorithm is obtained, called the Viterbi reestimation algorithm...
    • ...1. We studied in the section concerning topology of Part I [1] the questions of computing the distances between two distributions represented by PFA. In the case where the PFA are DPFA, the computation of the L2 distance and of the Kullback-Leibler divergence can take polynomial time, but what about the L1, L∞, and logarithmic distances? 2. In the same trend, it is reasonably clear that, if at least one of the distributions is represented ...
    • ...5. We have provided a number of results on distances in the section concerning distances of part I [1]...
    • ...Proof. By Proposition 11 of [1], D can be generated by a PFA with a single initial state...
    • ...which, according to (1) in Section 2.6 of [1] (and noting that, in our PFA, I(q0) = 1), is the probability of the path for x in A that y is associated with...
    • ...Finally, following (2) in Section 2.6 of [1] (that gives the probability of generating a string),...
    • ...For each path in M, there is one and only one path in A′. Moreover, by construction, I′(s1) = I(s1) and P(q, a, q′) = E(q, a) · T(q, q′); therefore, D_A′ = D_M. Finally, by Proposition 11 of [1], we can build a PFA A, with at most |Q| = n states, such that D_A′ = D_A. □

    Enrique Vidal et al. Probabilistic Finite-State Machines - Part II
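Several of the snippets above contrast the true (forward) probability of a string with the optimal-path (Viterbi) approximation. As a hedged sketch (the function names and the two-state toy automaton are my own, not from the paper), the two recurrences can be written as:

```python
def forward_prob(x, states, I, P, F):
    """True probability of string x under a PFA: sum over all accepting paths.
    I: initial-state probs, P: (state, symbol, state) -> prob, F: final probs."""
    alpha = {q: I.get(q, 0.0) for q in states}
    for a in x:
        alpha = {qp: sum(alpha[q] * P.get((q, a, qp), 0.0) for q in states)
                 for qp in states}
    return sum(alpha[q] * F.get(q, 0.0) for q in states)

def viterbi_prob(x, states, I, P, F):
    """Viterbi approximation: probability of the single best path only."""
    v = {q: I.get(q, 0.0) for q in states}
    for a in x:
        v = {qp: max(v[q] * P.get((q, a, qp), 0.0) for q in states)
             for qp in states}
    return max(v[q] * F.get(q, 0.0) for q in states)

# Toy nondeterministic PFA: on 'a', q0 either loops (prob 0.5) or moves to q1
# (prob 0.3); q0 stops with prob 0.2, q1 with prob 1.0.
states = ["q0", "q1"]
I = {"q0": 1.0}
P = {("q0", "a", "q0"): 0.5, ("q0", "a", "q1"): 0.3}
F = {"q0": 0.2, "q1": 1.0}
```

Because the automaton is nondeterministic, the string "a" is generated by two paths, so the forward probability (0.1 + 0.3 = 0.4) strictly exceeds the Viterbi value (0.3); this gap is exactly what the Viterbi reestimation algorithm trades away for simplicity.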

    • ...A number of ideas have been developed in the literature to deal with these problems [DA00, Tho01]...
    • ...Smoothing techniques [Tho01] are developed in order to cope with parsing failures by an Sdfa...

    Colin De La Higuera et al. Learning Stochastic Finite Automata for Musical Style Recognition
