<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
<title>XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182425" rel="alternate"/>
<subtitle/>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182425</id>
<updated>2026-03-11T01:09:37Z</updated>
<dc:date>2026-03-11T01:09:37Z</dc:date>
<entry>
<title>Implementation of a smart electricity meter</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/189271" rel="alternate"/>
<author>
<name>Manzin, Matías Federico</name>
</author>
<author>
<name>Navarria, Leonardo José</name>
</author>
<author>
<name>Medina, Santiago</name>
</author>
<author>
<name>Libutti, Leandro Ariel</name>
</author>
<author>
<name>Montezanti, Diego Miguel</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/189271</id>
<updated>2026-03-05T14:27:28Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025)
In recent years, the economic and ecological impact of energy consumption in public and private institutions has increased, making it necessary to have computer and electronic systems that allow monitoring energy expenditure in the different areas of an institution. Currently, smart buildings have automated systems that include sensor nodes that allow the monitoring and control of environmental parameters, facilitating the goal of achieving energy-efficient, comfortable, and cost-effective environments. In this context, this work presents the design and development of a smart energy consumption meter that can be integrated into any control and monitoring system without the need for significant changes, thanks to its modular implementation. The hardware architecture is presented, focusing on the main functionality of each module and the interconnection and communication between them. In addition, the calibration tests performed to adjust the measurement parameters and obtain acceptable relative measurement errors are described. Tests performed with the measuring node on a heterogeneous architecture computer, comprising CPU and GPU cores and running a computationally intensive algorithm, show that the accumulated power decreases as the amount of resources used increases, validating the measuring capability of the developed system.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>In recent years, the economic and ecological impact of energy consumption in public and private institutions has increased, making it necessary to have computer and electronic systems that allow monitoring energy expenditure in the different areas of an institution. Currently, smart buildings have automated systems that include sensor nodes that allow the monitoring and control of environmental parameters, facilitating the goal of achieving energy-efficient, comfortable, and cost-effective environments. In this context, this work presents the design and development of a smart energy consumption meter that can be integrated into any control and monitoring system without the need for significant changes, thanks to its modular implementation. The hardware architecture is presented, focusing on the main functionality of each module and the interconnection and communication between them. In addition, the calibration tests performed to adjust the measurement parameters and obtain acceptable relative measurement errors are described. Tests performed with the measuring node on a heterogeneous architecture computer, comprising CPU and GPU cores and running a computationally intensive algorithm, show that the accumulated power decreases as the amount of resources used increases, validating the measuring capability of the developed system.</dc:description>
</entry>
<entry>
<title>No Latency, No Waste: How Fog Computing Optimizes Precision Agriculture</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182615" rel="alternate"/>
<author>
<name>Gonçalves, Ricardo</name>
</author>
<author>
<name>Rossi, Gustavo Héctor</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182615</id>
<updated>2025-08-13T20:02:19Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
The implementation of Internet of Things (IoT) technologies in precision agriculture has been revolutionizing crop resource management. Soil sensors continuously monitor moisture, temperature, and salinity, providing crucial data for precise irrigation and soil management, while weather sensors track atmospheric conditions including air temperature, humidity, precipitation, and wind speed. This data enables prediction and management of pests and diseases, while also informing harvest decisions. Continuous monitoring and advanced data analysis allow for identification of trends and anomalies, facilitating rapid and precise adjustments to agricultural operations. In this article, we propose an approach to address the problem of latency that arises when using IoT in remote areas.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>The implementation of Internet of Things (IoT) technologies in precision agriculture has been revolutionizing crop resource management. Soil sensors continuously monitor moisture, temperature, and salinity, providing crucial data for precise irrigation and soil management, while weather sensors track atmospheric conditions including air temperature, humidity, precipitation, and wind speed. This data enables prediction and management of pests and diseases, while also informing harvest decisions. Continuous monitoring and advanced data analysis allow for identification of trends and anomalies, facilitating rapid and precise adjustments to agricultural operations. In this article, we propose an approach to address the problem of latency that arises when using IoT in remote areas.</dc:description>
</entry>
<entry>
<title>Interpretable Machine Learning for Real Estate Valuation: A Case Study with Small Data</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182614" rel="alternate"/>
<author>
<name>Gutiérrez, Emiliano</name>
</author>
<author>
<name>López del Río, Lorena Caridad</name>
</author>
<author>
<name>Caridad Ocerín, Jose María</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182614</id>
<updated>2025-08-13T20:02:20Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
This study explores residential property valuation in Seville, Spain, using interpretable machine learning techniques on a small dataset of 1,701 apartment sales listings collected online. Unlike conventional approaches that rely on large datasets, our research addresses the unique challenges of small data samples while maintaining model interpretability.&#13;
We compare traditional hedonic linear regression with Random Forest algorithms. The results provide actionable insights for real estate stakeholders in medium-sized urban markets, bridging the gap between econometric tradition and machine learning innovation.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>This study explores residential property valuation in Seville, Spain, using interpretable machine learning techniques on a small dataset of 1,701 apartment sales listings collected online. Unlike conventional approaches that rely on large datasets, our research addresses the unique challenges of small data samples while maintaining model interpretability.&#13;
We compare traditional hedonic linear regression with Random Forest algorithms. The results provide actionable insights for real estate stakeholders in medium-sized urban markets, bridging the gap between econometric tradition and machine learning innovation.</dc:description>
</entry>
<entry>
<title>Internet deception to share IoC</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182613" rel="alternate"/>
<author>
<name>Maddalena Kreff, Pablo Germán</name>
</author>
<author>
<name>Venosa, Paula</name>
</author>
<author>
<name>Bazán, Patricia Alejandra</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182613</id>
<updated>2025-08-13T20:02:21Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
Detecting cybersecurity attacks through the collection and analysis of information is a challenge; this work focuses on the use of honeypots, which are decoys that allow attacks to be studied in a controlled environment. The data collected can be used as a source of information in Cyber Threat Intelligence (CTI). Cyber Deception is a form of deception that exploits digital tools to deceive, manipulate, or confuse a target, where the value lies in being attacked and investigated. Thus, honeypots constitute a Cyber Deception mechanism. Cybersecurity frameworks provide a reference model for the analysis of attacks, favoring their classification and understanding in order to mitigate them. Furthermore, these frameworks help to understand the stages of attacks linked to Cyber Deception mechanisms, including honeypots. The aim of this work is to analyze the communication mechanisms between honeypots and CTI platforms in order to improve organizations' cybersecurity strategies.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>Detecting cybersecurity attacks through the collection and analysis of information is a challenge; this work focuses on the use of honeypots, which are decoys that allow attacks to be studied in a controlled environment. The data collected can be used as a source of information in Cyber Threat Intelligence (CTI). Cyber Deception is a form of deception that exploits digital tools to deceive, manipulate, or confuse a target, where the value lies in being attacked and investigated. Thus, honeypots constitute a Cyber Deception mechanism. Cybersecurity frameworks provide a reference model for the analysis of attacks, favoring their classification and understanding in order to mitigate them. Furthermore, these frameworks help to understand the stages of attacks linked to Cyber Deception mechanisms, including honeypots. The aim of this work is to analyze the communication mechanisms between honeypots and CTI platforms in order to improve organizations' cybersecurity strategies.</dc:description>
</entry>
<entry>
<title>Ensuring Quality in the OECD AI Lifecycle Through ISO/IEC Standards</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182612" rel="alternate"/>
<author>
<name>Torres, Juan Ignacio</name>
</author>
<author>
<name>Pasini, Ariel Cristian</name>
</author>
<author>
<name>Pesado, Patricia Mabel</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182612</id>
<updated>2025-08-13T20:02:21Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
The integration of artificial intelligence (AI) technologies within organizations presents both significant opportunities and complex challenges. To manage this complexity, ISO/IEC standards provide a structured framework for the adoption and management of AI systems throughout their lifecycle. This article explores the role of ISO/IEC standards in ensuring the quality, security, and ethical alignment of AI systems, based on the lifecycle framework defined by the Organisation for Economic Co-operation and Development (OECD). The paper outlines how these standards support AI system development, from planning and design through deployment and monitoring, addressing critical issues such as governance, data quality, bias detection, and system reliability. A comprehensive quality model is proposed, drawing on ISO/IEC 25058 and 25059 standards, to assess the effectiveness and transparency of AI systems in real-world environments.&#13;
The adoption of these standards is shown to enhance corporate reputation, improve regulatory compliance, and mitigate risks, positioning organizations to leverage AI technologies responsibly and efficiently.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>The integration of artificial intelligence (AI) technologies within organizations presents both significant opportunities and complex challenges. To manage this complexity, ISO/IEC standards provide a structured framework for the adoption and management of AI systems throughout their lifecycle. This article explores the role of ISO/IEC standards in ensuring the quality, security, and ethical alignment of AI systems, based on the lifecycle framework defined by the Organisation for Economic Co-operation and Development (OECD). The paper outlines how these standards support AI system development, from planning and design through deployment and monitoring, addressing critical issues such as governance, data quality, bias detection, and system reliability. A comprehensive quality model is proposed, drawing on ISO/IEC 25058 and 25059 standards, to assess the effectiveness and transparency of AI systems in real-world environments.&#13;
The adoption of these standards is shown to enhance corporate reputation, improve regulatory compliance, and mitigate risks, positioning organizations to leverage AI technologies responsibly and efficiently.</dc:description>
</entry>
<entry>
<title>Development of a Hand Motion Sensing Glove for Exergames: Design Evolution and Future Perspectives</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182611" rel="alternate"/>
<author>
<name>Herrera del Gener, Aldana Mariel</name>
</author>
<author>
<name>Sanz, Cecilia Verónica</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182611</id>
<updated>2025-08-13T20:02:23Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
Exergames have gained popularity as interactive systems that promote physical activity through gaming. This paper presents the development process of a hand motion-sensing glove designed to complement an existing ankle-worn motion sensor. The glove aims to provide a more immersive exergaming experience by accurately tracking hand movements. Several prototypes were developed and iteratively improved to refine the design and functionality. This paper outlines the background of exergaming technology, reviews related work, and details the iterative development of the glove, discussing challenges encountered and improvements made. Finally, future steps for completing the final prototype are discussed.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>Exergames have gained popularity as interactive systems that promote physical activity through gaming. This paper presents the development process of a hand motion-sensing glove designed to complement an existing ankle-worn motion sensor. The glove aims to provide a more immersive exergaming experience by accurately tracking hand movements. Several prototypes were developed and iteratively improved to refine the design and functionality. This paper outlines the background of exergaming technology, reviews related work, and details the iterative development of the glove, discussing challenges encountered and improvements made. Finally, future steps for completing the final prototype are discussed.</dc:description>
</entry>
<entry>
<title>Contributions to the modeling and simulation of an automatic package classification system to improve decision-making in line balancing</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182610" rel="alternate"/>
<author>
<name>Acosta, Esteban</name>
</author>
<author>
<name>De Queiroz, Jose Antonio</name>
</author>
<author>
<name>Gaudiani, Adriana Angélica</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182610</id>
<updated>2025-08-13T20:02:24Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
The growth of e-commerce has led to an increase in the complexity of automatic sorting systems (Sorters), especially evident during the COVID-19 pandemic. This is reflected in the growing number of destinations, variety of products, reduced batch sizes, diversity in box dimensions, varying routes, and the need for rapid response times, among other factors. These complexities hinder decision-making in the system, particularly in developing a line balancing program for package unloading lines from the Sorter. Therefore, tools that support improved decision-making are required. This publication contributes, on one hand, to the conceptual modeling of the system using the IDEF-SIM conceptual modeling technique to better understand it.&#13;
On the other hand, it contributes to the construction of the simulation model using the FlexSim® software. Finally, a heuristic-based simulation optimization methodology is proposed to enhance decision-making in balancing the sorting line.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>The growth of e-commerce has led to an increase in the complexity of automatic sorting systems (Sorters), especially evident during the COVID-19 pandemic. This is reflected in the growing number of destinations, variety of products, reduced batch sizes, diversity in box dimensions, varying routes, and the need for rapid response times, among other factors. These complexities hinder decision-making in the system, particularly in developing a line balancing program for package unloading lines from the Sorter. Therefore, tools that support improved decision-making are required. This publication contributes, on one hand, to the conceptual modeling of the system using the IDEF-SIM conceptual modeling technique to better understand it.&#13;
On the other hand, it contributes to the construction of the simulation model using the FlexSim® software. Finally, a heuristic-based simulation optimization methodology is proposed to enhance decision-making in balancing the sorting line.</dc:description>
</entry>
<entry>
<title>Automated and Secure Login in ALERTAR, a Resilient Cloud–Fog–Edge mHealth System for Hospital Environments</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182609" rel="alternate"/>
<author>
<name>Zanellato, Claudio</name>
</author>
<author>
<name>Cañibano, Rodrigo</name>
</author>
<author>
<name>Balladini, Javier</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182609</id>
<updated>2025-08-13T20:02:24Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
The ALERTAR system aims to assist healthcare providers in identifying early clinical deterioration in hospitalized patients on general wards. Its cloud-fog-edge architecture uses only mobile devices at the fog and edge levels to simplify its operability. As a critical healthcare system, it is of utmost importance to provide resilience across multiple layers of the architecture. To tolerate faults, the system can dynamically migrate devices between the fog and edge layers. This work focuses on describing an automated and secure login method for this system. To reduce user intervention and simplify system use, the login process has been automated by storing session credentials and utilizing a fog-level device discovery process. The design of the login mechanism is aligned with the strict security requirements of the system, considering the sensitivity of the data and the criticality of the service. This is a work in progress, so performance evaluations are still pending.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>The ALERTAR system aims to assist healthcare providers in identifying early clinical deterioration in hospitalized patients on general wards. Its cloud-fog-edge architecture uses only mobile devices at the fog and edge levels to simplify its operability. As a critical healthcare system, it is of utmost importance to provide resilience across multiple layers of the architecture. To tolerate faults, the system can dynamically migrate devices between the fog and edge layers. This work focuses on describing an automated and secure login method for this system. To reduce user intervention and simplify system use, the login process has been automated by storing session credentials and utilizing a fog-level device discovery process. The design of the login mechanism is aligned with the strict security requirements of the system, considering the sensitivity of the data and the criticality of the service. This is a work in progress, so performance evaluations are still pending.</dc:description>
</entry>
<entry>
<title>WEEE Prediction Model Based on Neural Networks</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182608" rel="alternate"/>
<author>
<name>Facuy, Jussen</name>
</author>
<author>
<name>Pasini, Ariel Cristian</name>
</author>
<author>
<name>Estévez, Elsa Clara</name>
</author>
<author>
<name>Moran, César</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182608</id>
<updated>2025-08-13T20:02:25Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
The need to develop intelligent and innovative solutions to reduce pollution generated by Waste Electrical and Electronic Equipment (WEEE) led to the construction of a WEEE prediction model based on neural networks. The information supporting the model is derived from data obtained through a survey, as well as historical data on WEEE generation in Ecuador. The model aims to estimate waste generation within a specific month and year. Neural network algorithms were used for the model's functionality due to their adaptability to dynamic data such as those used here. The development of this model considered five phases: data collection, preprocessing, model generation, model application, and verification and continuous improvement. It is concluded that the proposed model provides a detailed description of the architecture, phases, and procedures required for its operation, facilitating its understanding and subsequent implementation.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>The need to develop intelligent and innovative solutions to reduce pollution generated by Waste Electrical and Electronic Equipment (WEEE) led to the construction of a WEEE prediction model based on neural networks. The information supporting the model is derived from data obtained through a survey, as well as historical data on WEEE generation in Ecuador. The model aims to estimate waste generation within a specific month and year. Neural network algorithms were used for the model's functionality due to their adaptability to dynamic data such as those used here. The development of this model considered five phases: data collection, preprocessing, model generation, model application, and verification and continuous improvement. It is concluded that the proposed model provides a detailed description of the architecture, phases, and procedures required for its operation, facilitating its understanding and subsequent implementation.</dc:description>
</entry>
<entry>
<title>Rule-Based Matching for Real Estate Features Detection</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182588" rel="alternate"/>
<author>
<name>Ibáñez Gutkin, Mateo Agustín</name>
</author>
<author>
<name>Pagano, Alvaro A.</name>
</author>
<author>
<name>Bazzana Tanevitch, Luciana</name>
</author>
<author>
<name>Torres, Diego F.</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182588</id>
<updated>2025-08-12T20:02:28Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
Most of the information about real estate for sale in the Buenos Aires province, Argentina, is unstructured, which means that it does not always follow the same format, making extraction a challenging process. Variability in wording, human errors, noise, and incomplete data further complicate the task. Given the large volume of information available, automated techniques are required to transform unstructured text into structured data. This article presents an approach to extract attribute-value pairs from the information contained in the property listings for the province of Buenos Aires, in order to incorporate this data into a knowledge graph. The approach uses pattern-based information extraction for 17 features with an exhaustive evaluation over two datasets: a ground truth labeled by experts and a dataset containing a real-world use case. The results demonstrate accurate values.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>Most of the information about real estate for sale in the Buenos Aires province, Argentina, is unstructured, which means that it does not always follow the same format, making extraction a challenging process. Variability in wording, human errors, noise, and incomplete data further complicate the task. Given the large volume of information available, automated techniques are required to transform unstructured text into structured data. This article presents an approach to extract attribute-value pairs from the information contained in the property listings for the province of Buenos Aires, in order to incorporate this data into a knowledge graph. The approach uses pattern-based information extraction for 17 features with an exhaustive evaluation over two datasets: a ground truth labeled by experts and a dataset containing a real-world use case. The results demonstrate accurate values.</dc:description>
</entry>
<entry>
<title>Pipeline to detect spike-and-wave EEG patterns based on polynomial regression modeling and Taylor series feature selection</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182586" rel="alternate"/>
<author>
<name>Adell, Matias F.</name>
</author>
<author>
<name>Balda, Javier</name>
</author>
<author>
<name>Casas, Facundo</name>
</author>
<author>
<name>D’Giano, Carlos</name>
</author>
<author>
<name>Quintero-Rincón, Antonio</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182586</id>
<updated>2025-08-12T20:02:29Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Conference paper
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, June 24–26, 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
Epilepsy is a common neurological disorder diagnosed and monitored through EEG recordings. Accurate spike-and-wave (SW) pattern classification is crucial for distinguishing this epileptic seizure disorder from normal brain wave activity (NW). However, mathematically modeling SW remains challenging, affecting classification accuracy. This study proposes a two-stage pipeline combining polynomial regression techniques and data processing in a machine-learning classification scheme. In the first stage, the idea is to create a generalized mother waveform that represents all the waveforms of the EEG patterns, such as SW and NW. This waveform is derived from a polynomial regression model that is assessed by the truncation error of the Taylor series. In the second stage, a feature selection algorithm based on a vector that includes the Taylor coefficients and the statistical properties of the SW and NW waveforms was designed for the machine learning classifier. This algorithm uses the confidence interval to extract the Taylor series points that do not represent the generalized mother equation. This yields a dimensionality reduction of this vector, which can be used in a classification and detection scheme. Three polynomial regression models (Fourier, Gaussian, and sums-of-sines) were evaluated using the pipeline methodology. The best model was the Fourier regression, which achieved an accuracy of 96.2% using an SVM classifier with a Gaussian kernel to detect spike-and-wave patterns.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>Epilepsy is a common neurological disorder diagnosed and monitored through EEG recordings. Accurate spike-and-wave (SW) pattern classification is crucial for distinguishing this epileptic seizure disorder from normal brain wave activity (NW). However, mathematically modeling SW remains challenging, affecting classification accuracy. This study proposes a two-stage pipeline combining polynomial regression techniques and data processing in a machine-learning classification scheme. In the first stage, the idea is to create a generalized mother waveform that represents all the waveforms of the EEG patterns, such as SW and NW. This waveform is derived from a polynomial regression model that is assessed by the truncation error of the Taylor series. In the second stage, a feature selection algorithm based on a vector that includes the Taylor coefficients and the statistical properties of the SW and NW waveforms was designed for the machine learning classifier. This algorithm uses the confidence interval to extract the Taylor series points that do not represent the generalized mother equation. This yields a dimensionality reduction of this vector, which can be used in a classification and detection scheme. Three polynomial regression models (Fourier, Gaussian, and sums-of-sines) were evaluated using the pipeline methodology. The best model was the Fourier regression, which achieved an accuracy of 96.2% using an SVM classifier with a Gaussian kernel to detect spike-and-wave patterns.</dc:description>
</entry>
<entry>
<title>Orthogonal Moments-Based Feature Extraction for MRI Classification</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182583" rel="alternate"/>
<author>
<name>Degiuseppe, Gonzalo</name>
</author>
<author>
<name>Quintero-Rincón, Antonio</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182583</id>
<updated>2025-08-12T20:02:29Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Objeto de conferencia
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, 24 al 26 de junio de 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
Orthogonal moments are a current area of research in image analysis and pattern recognition. They are numerical values obtained by projecting an image intensity function onto a polynomial basis of the 2D coordinates to describe the distribution of pixels in an image space.&#13;
This work proposes using orthogonal moments in MRI images as a feature extraction tool for detecting and classifying brain tumors, including gliomas, meningiomas, and pituitary cases. The method has three stages and employs the Random Forest model (RF) as its core foundation. In the first stage, Legendre Moments and the First and Second Order Chebyshev Moments are analyzed to extract features based on the weighted average of MRI image pixel intensities. In the second stage, the feature selection vector is calculated using the orthogonal moment features obtained in the previous stage. RF determines the majority vote for each class, while the Gini coefficient evaluates its concentration, leading to dimensionality reduction. In the final stage, the feature vector is utilized in a multiclass classifier framework based on RF to diagnose the type of brain tumor. The proposed methodology achieved an average accuracy of 96.49% across all brain tumor classes. Preliminary results indicate that this family of descriptors has significant potential for feature extraction in detecting brain tumors in MRI images.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>Orthogonal moments are a current area of research in image analysis and pattern recognition. They are numerical values obtained by projecting an image intensity function onto a polynomial basis of the 2D coordinates to describe the distribution of pixels in an image space.&#13;
This work proposes using orthogonal moments in MRI images as a feature extraction tool for detecting and classifying brain tumors, including gliomas, meningiomas, and pituitary cases. The method has three stages and employs the Random Forest model (RF) as its core foundation. In the first stage, Legendre Moments and the First and Second Order Chebyshev Moments are analyzed to extract features based on the weighted average of MRI image pixel intensities. In the second stage, the feature selection vector is calculated using the orthogonal moment features obtained in the previous stage. RF determines the majority vote for each class, while the Gini coefficient evaluates its concentration, leading to dimensionality reduction. In the final stage, the feature vector is utilized in a multiclass classifier framework based on RF to diagnose the type of brain tumor. The proposed methodology achieved an average accuracy of 96.49% across all brain tumor classes. Preliminary results indicate that this family of descriptors has significant potential for feature extraction in detecting brain tumors in MRI images.</dc:description>
</entry>
<entry>
<title>Development and Implementation of an AWS-Based Platform for Automated Prediction of Antibiotic Resistance from Mass Spectrometry Profiles</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182579" rel="alternate"/>
<author>
<name>Flores Estay, Moises E.</name>
</author>
<author>
<name>Lopéz Cortés, Xaviera A.</name>
</author>
<author>
<name>Tirado, Felipe</name>
</author>
<author>
<name>Bernal Osses, José J. I.</name>
</author>
<author>
<name>Manríquez-Troncoso, José M.</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182579</id>
<updated>2025-08-12T20:02:29Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Objeto de conferencia
XIII Jornadas de Cloud Computing, Big Data &amp; Emerging Topics (La Plata, 24 al 26 de junio de 2025); 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)
Antimicrobial resistance (AMR) is a growing threat to global public health, yet few studies have addressed the deployment of intelligent systems for its early prediction in clinical settings. This work presents the development and deployment of a cloud-based web platform that leverages artificial intelligence to predict bacterial resistance using MALDI-TOF mass spectrometry data.&#13;
The system was developed using a modular architecture deployed on Amazon Web Services (AWS), combining serverless components and dedicated instances for efficient and scalable operation. A total of 316 clinical isolates of Escherichia coli were collected from the Regional Hospital of Talca, Chile, between 2022 and 2023. A benchmarking analysis comparing CatBoost, XGBoost, and LightGBM was conducted to select the most effective boosting algorithm.&#13;
The best performance was achieved with CatBoost for the Ciprofloxacin case, reaching an AUROC and AUPRC of 0.91. Results for Ceftriaxone were slightly lower, likely due to class imbalance. These outcomes highlight the robustness of boosting models even under real-world data constraints.&#13;
The entire platform was deployed on Amazon Web Services (AWS) using a modular serverless architecture, enabling scalability, cost-efficiency, and easy integration into hospital workflows. This study demonstrates the feasibility of integrating AI-powered prediction systems into clinical environments to support timely and data-driven antimicrobial resistance management.
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>Antimicrobial resistance (AMR) is a growing threat to global public health, yet few studies have addressed the deployment of intelligent systems for its early prediction in clinical settings. This work presents the development and deployment of a cloud-based web platform that leverages artificial intelligence to predict bacterial resistance using MALDI-TOF mass spectrometry data.&#13;
The system was developed using a modular architecture deployed on Amazon Web Services (AWS), combining serverless components and dedicated instances for efficient and scalable operation. A total of 316 clinical isolates of Escherichia coli were collected from the Regional Hospital of Talca, Chile, between 2022 and 2023. A benchmarking analysis comparing CatBoost, XGBoost, and LightGBM was conducted to select the most effective boosting algorithm.&#13;
The best performance was achieved with CatBoost for the Ciprofloxacin case, reaching an AUROC and AUPRC of 0.91. Results for Ceftriaxone were slightly lower, likely due to class imbalance. These outcomes highlight the robustness of boosting models even under real-world data constraints.&#13;
The entire platform was deployed on Amazon Web Services (AWS) using a modular serverless architecture, enabling scalability, cost-efficiency, and easy integration into hospital workflows. This study demonstrates the feasibility of integrating AI-powered prediction systems into clinical environments to support timely and data-driven antimicrobial resistance management.</dc:description>
</entry>
<entry>
<title>13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025)</title>
<link href="http://sedici.unlp.edu.ar:80/handle/10915/182427" rel="alternate"/>
<author>
<name>Naiouf, Marcelo Ricardo</name>
</author>
<author>
<name>Chichizola, Franco</name>
</author>
<author>
<name>De Giusti, Laura Cristina</name>
</author>
<author>
<name>Libutti, Leandro Ariel</name>
</author>
<id>http://sedici.unlp.edu.ar:80/handle/10915/182427</id>
<updated>2025-08-11T20:02:28Z</updated>
<published>2025-01-01T00:00:00Z</published>
<summary type="text">Libro
Naiouf, Marcelo Ricardo; Chichizola, Franco; De Giusti, Laura Cristina; Libutti, Leandro Ariel
Proceedings of the 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025), held in a hybrid modality (both on-site and live online settings were allowed). JCC-BD&amp;ET 2025 was organized by the III-LIDI and the Postgraduate Office, both from the School of Computer Science of the National University of La Plata. JCC-BD&amp;ET 2025 covered the following topics: high-performance, edge and fog computing; internet of things; modelling and simulation; big and open data; machine and deep learning; smart cities; e-government; human-computer interaction; visualization; and special topics related to emerging technologies.
Facultad de Informática (UNLP)
</summary>
<dc:date>2025-01-01T00:00:00Z</dc:date>
<dc:description>Proceedings of the 13th Conference on Cloud Computing, Big Data &amp; Emerging Topics (JCC-BD&amp;ET 2025), held in a hybrid modality (both on-site and live online settings were allowed). JCC-BD&amp;ET 2025 was organized by the III-LIDI and the Postgraduate Office, both from the School of Computer Science of the National University of La Plata. JCC-BD&amp;ET 2025 covered the following topics: high-performance, edge and fog computing; internet of things; modelling and simulation; big and open data; machine and deep learning; smart cities; e-government; human-computer interaction; visualization; and special topics related to emerging technologies.</dc:description>
</entry>
</feed>
