ALBERT

All Library Books, journals and Electronic Records Telegrafenberg



  • 1
    Electronic Resource
    Springer
    The visual computer 12 (1996), pp. 193-201
    ISSN: 1432-2315
    Keywords: Symmetry detection ; Congruity problem ; Algorithm design ; Graph theory ; Computational geometry
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract We propose a simple and efficient general algorithm for determining both rotational and involutional symmetries of polyhedra. It requires O(m²) time and uses O(m) space, where m is the number of edges of the polyhedron. As this is the lower bound of the symmetry detection problem for the considered output form, our algorithm is optimal. We show that a slight modification of our symmetry detection algorithm can be used to solve the related congruity problem of polyhedra.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 2
    Electronic Resource
    Springer
    The visual computer 12 (1996), pp. 193-201
    ISSN: 1432-2315
    Keywords: Symmetry detection ; Congruity problem ; Algorithm design ; Graph theory ; Computational geometry
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract We propose a simple and efficient general algorithm for determining both rotational and involutional symmetries of polyhedra. It requires O(m²) time and uses O(m) space, where m is the number of edges of the polyhedron. As this is the lower bound of the symmetry detection problem for the considered output form, our algorithm is optimal. We show that a slight modification of our symmetry detection algorithm can be used to solve the related congruity problem of polyhedra.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 3
    Electronic Resource
    Springer
    Journal of mathematical imaging and vision 11 (1999), pp. 27-43
    ISSN: 1573-7683
    Keywords: Bayesian interpolation ; regularization ; hyperparameters
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract When interpolating incomplete data, one can choose a parametric model, or opt for a more general approach and use a non-parametric model which allows a very large class of interpolants. A popular non-parametric model for interpolating various types of data is based on regularization, which looks for an interpolant that is both close to the data and also “smooth” in some sense. Formally, this interpolant is obtained by minimizing an error functional which is the weighted sum of a “fidelity term” and a “smoothness term”. The classical approach to regularization is: select “optimal” weights (also called hyperparameters) that should be assigned to these two terms, and minimize the resulting error functional. However, using only the “optimal weights” does not guarantee that the chosen function will be optimal in some sense, such as the maximum likelihood criterion, or the minimal square error criterion. For that, we have to consider all possible weights. The approach suggested here is to use the full probability distribution on the space of admissible functions, as opposed to the probability induced by using a single combination of weights. The reason is as follows: the weight actually determines the probability space in which we are working. For a given weight λ, the probability of a function f is proportional to exp(−λ ∫ (f_uu)² du) (for the case of a function with one variable). For each different λ, there is a different solution to the restoration problem; denote it by fλ. Now, if we had known λ, it would not be necessary to use all the weights; however, all we are given are some noisy measurements of f, and we do not know the correct λ. Therefore, the mathematically correct solution is to calculate, for every λ, the probability that f was sampled from a space whose probability is determined by λ, and average the different fλ's weighted by these probabilities. The same argument holds for the noise variance, which is also unknown.
    Three basic problems are addressed in this work:
      • Computing the MAP estimate, that is, the function f maximizing Pr(f|D) when the data D is given. This problem is reduced to a one-dimensional optimization problem.
      • Computing the MSE estimate. This function is defined at each point x as ∫ f(x) Pr(f|D) df. This problem is reduced to computing a one-dimensional integral. In the general setting, the MAP estimate is not equal to the MSE estimate.
      • Computing the pointwise uncertainty associated with the MSE solution. This problem is reduced to computing three one-dimensional integrals.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
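The symmetry-detection abstracts above (records 1 and 2) concern finding rotational symmetries of polyhedra. As a minimal illustration of what "detecting a rotational symmetry" means, the sketch below brute-force checks whether a rotation maps a 2-D vertex set onto itself. The function name is hypothetical and the O(n²) brute-force matching is mine; it is not the papers' optimal O(m²) graph-based algorithm for general polyhedra.

```python
import math

def is_rotational_symmetry(vertices, angle, tol=1e-9):
    """Check whether rotating a 2-D vertex set about the origin by
    `angle` maps the set onto itself (brute-force O(n^2) matching).

    Illustrative only -- not the O(m^2) graph-theoretic algorithm
    described in the abstract."""
    c, s = math.cos(angle), math.sin(angle)
    rotated = [(c * x - s * y, s * x + c * y) for x, y in vertices]
    # Every rotated vertex must coincide with some original vertex.
    return all(
        any(abs(rx - x) < tol and abs(ry - y) < tol for x, y in vertices)
        for rx, ry in rotated
    )

# A square centred at the origin has 4-fold rotational symmetry.
square = [(1, 1), (-1, 1), (-1, -1), (1, -1)]
print(is_rotational_symmetry(square, math.pi / 2))   # True
print(is_rotational_symmetry(square, math.pi / 3))   # False
```

Collecting all angles that pass this test recovers the rotational symmetry group of the vertex set, which is the output form the abstract refers to.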
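The regularization abstract (record 3) describes an interpolant minimizing the weighted sum of a fidelity term and a smoothness term for a single weight λ. A minimal discrete sketch of that functional, assuming a second-difference smoothness penalty and a closed-form least-squares solve (function and variable names are mine; this uses one fixed weight, not the paper's Bayesian average over all weights):

```python
import numpy as np

def regularized_interpolant(data, lam):
    """Minimise ||f - d||^2 + lam * ||D2 f||^2, where D2 is the
    discrete second-difference operator standing in for the
    smoothness term integral of (f_uu)^2.

    Closed form: f = (I + lam * D2^T D2)^{-1} d."""
    d = np.asarray(data, dtype=float)
    n = len(d)
    # Second-difference matrix D2, shape (n-2, n).
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    A = np.eye(n) + lam * D2.T @ D2
    return np.linalg.solve(A, d)

# Larger lam pulls the solution toward a straight line;
# lam = 0 returns the data unchanged.
noisy = [0.0, 1.1, 1.9, 3.2, 3.9, 5.1]
smooth = regularized_interpolant(noisy, lam=10.0)
```

In the paper's Bayesian view, one would instead average such solutions fλ over λ, weighted by the posterior probability of each λ given the data.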