Amany Gamel's Projects
When you enter a year, for example 2002, the program subtracts it from the current year; e.g. 2023 − 2002 = 21.
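A minimal Python sketch of this calculation (the function name is mine, not from the repo):

```python
from datetime import date

def years_since(year: int) -> int:
    """Subtract the entered year from the current year."""
    return date.today().year - year
```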
Config files for my GitHub profile.
Cloud task: I was asked to disable the Drupal site, restrict database access to localhost, and save a volume in order to keep the data on disk.
Dockerfile for a Spring Boot app: an ARG JAR_FILE points to target/spring-boot-web.jar, WORKDIR is set to /opt/opp, and the JAR is copied into the image as app.jar.
The open-source repo for docs.github.com
This IoT project senses a person's hand and opens the door automatically.
This IoT project detects whether gas is present; when it is, a buzzer sounds a loud alarm and the device sends an email alert.
HTTP Server -> localhost
Image Filters (noise removal in image processing) — our project for Algorithms Design and Analysis.
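One classic noise-removal filter that fits this kind of project is a median filter; a naive NumPy sketch (my own illustration, not the repo's code):

```python
import numpy as np

def median_filter(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Naive k x k median filter, effective against salt-and-pepper noise."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")  # replicate borders so output keeps the shape
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            # replace each pixel with the median of its k x k neighborhood
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out
```

A single bright noise pixel in a dark region is removed because the neighborhood median ignores outliers.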
An IoT project: wiring the circuit and writing code that drives three LEDs, cycling from one LED to the next.
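The cycling logic can be sketched in Python like this (the pin numbers and function name are hypothetical; the real project runs on a microcontroller):

```python
from itertools import cycle

LED_PINS = [17, 27, 22]  # hypothetical GPIO pin numbers for the three LEDs

def led_sequence(steps: int) -> list:
    """Return which pin would be lit at each step, cycling round-robin."""
    pins = cycle(LED_PINS)
    return [next(pins) for _ in range(steps)]
```

On real hardware each step would set one pin high, sleep, then set it low before moving on.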
Liver tumors are among the most dangerous diseases and can develop quickly if not treated properly, so we wanted to build a program to help doctors and patients with their examinations.
simple_login in React Native
Infrastructure Provisioning with Terraform, Configuration Management with Ansible, Containerization with Docker, Continuous Integration with Jenkins, Automated Deployment Pipeline, Monitoring and Logging, AWS Integration
simple_page_instagram with React Native
Splits a scan into sagittal, coronal, and axial planes, segments the liver from the image, segments the tumor from the liver, and classifies the image as tumor or no tumor.
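The first step, splitting a 3D scan into the three anatomical planes, can be sketched with NumPy (the axis-to-plane mapping is an assumption — it depends on how the volume was stored):

```python
import numpy as np

def extract_planes(volume: np.ndarray, i: int, j: int, k: int):
    """Slice a 3D scan (depth, height, width) into the three anatomical planes.
    Which axis maps to which plane is assumed here, not taken from the repo."""
    axial    = volume[i, :, :]  # one slice across the depth axis
    coronal  = volume[:, j, :]  # one slice across the height axis
    sagittal = volume[:, :, k]  # one slice across the width axis
    return sagittal, coronal, axial
```

Segmentation and classification would then run on these 2D slices.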
Classifies an image into the class it belongs to, reporting a probability for each class.
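Per-class probabilities are typically produced by a softmax over the model's raw scores; a self-contained sketch (not the repo's actual code):

```python
import math

def softmax(logits):
    """Convert raw class scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]
```

The highest-scoring class gets the highest probability, and the whole distribution sums to 1.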
Jenkins
This is simple_calculater: sum, subtract, sin, cos, and divide; more functions and features can be added.
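A minimal dispatch-table sketch of such a calculator (names are mine; adding a feature means adding one dictionary entry):

```python
import math

OPS = {
    "sum":      lambda a, b: a + b,
    "subtract": lambda a, b: a - b,
    "divide":   lambda a, b: a / b,
    "sin":      lambda a: math.sin(a),
    "cos":      lambda a: math.cos(a),
}

def calculate(op: str, *args: float) -> float:
    """Look up the requested operation and apply it to the arguments."""
    return OPS[op](*args)
```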
Backpropagation is a process involved in training a neural network. It involves taking the error rate of a forward propagation and feeding this loss backward through the neural network layers to fine-tune the weights.
Backpropagation is the essence of neural net training. It is the practice of fine-tuning the weights of a neural net based on the error rate (i.e. loss) obtained in the previous epoch (i.e. iteration.) Proper tuning of the weights ensures lower error rates, making the model reliable by increasing its generalization.
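A toy illustration of the idea above — forward pass, loss, gradient fed backward, weight update — assuming a single sigmoid neuron with squared-error loss (all names and numbers are mine):

```python
import math

def train_neuron(xs, ys, lr=0.5, epochs=200):
    """One sigmoid neuron trained by backpropagation:
    forward pass -> loss -> gradient fed backward -> weight update."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            # forward pass
            z = w * x + b
            p = 1.0 / (1.0 + math.exp(-z))
            # backward pass: d(loss)/dz for squared error through the sigmoid
            grad_z = (p - y) * p * (1.0 - p)
            # update the weights against the gradient (gradient descent)
            w -= lr * grad_z * x
            b -= lr * grad_z
    return w, b
```

Each epoch feeds the error of the forward pass backward to nudge the weights, which is exactly the loop described above scaled down to one neuron.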
TASC is divided into two parts: single read and double read. It uses an Eulerian-path approach to map reads onto a reference genome.
to_do_list with features to add, remove, update, and insert items, built in React Native.