  • Import data of auto spare parts
  • Convert dataframe to bytes python


There are several ways to turn a pandas DataFrame into bytes, depending on whether you need a raw memory buffer, a serialized file format, or a byte-encoded text representation.

The most direct route goes through NumPy. DataFrame.to_numpy() (or the older .values attribute) returns the underlying array; for instance, to convert a dataframe called df: np_array = df.to_numpy(). Be aware that an array obtained through .values (or a frame built from an array via the constructor) may share memory with the DataFrame, so changing one changes the other; be careful if you want to use them separately. (The behaviour described here was checked against pandas 0.25.1.)

Column dtypes often need fixing first. With the nba.csv example, the data contains some NaN values, so drop those rows before casting to avoid errors:

    import pandas as pd

    df = pd.read_csv("nba.csv")
    df.dropna(inplace=True)                 # drop all rows containing NaN
    before = type(df.Weight[0])
    df.Weight = df.Weight.astype('int64')
    after = type(df.Weight[0])

Other export paths: DataFrame.to_dict() converts a DataFrame into a dictionary or a list of dictionaries, and DataFrame.to_string() accepts per-column formatter functions, each of which must return a unicode string and is applied only to the non-NaN elements, with NaN handled by na_rep (changed in version 1.2.0; sparsify is a bool, optional, default True).

A DataFrame can also be written to an in-memory buffer, for example as an Excel workbook:

    import io
    import pandas as pd

    df = pd.DataFrame({'A': [1, 2, 3], 'B': [4, 5, 6]})
    towrite = io.BytesIO()
    df.to_excel(towrite)   # write to the BytesIO buffer
    towrite.seek(0)        # reset the pointer

If this raises AttributeError: '_io.BytesIO' object has no attribute 'write_cells', it typically means the installed pandas version does not accept a raw buffer here and is treating it as an ExcelWriter; wrapping the buffer in pd.ExcelWriter(), or upgrading pandas and the Excel engine (openpyxl/xlsxwriter), resolves it.

In the other direction, raw bytes can be interpreted as integers with the struct module, e.g. struct.unpack("<L", b"y\xcc\xa6\xbb")[0] gives 3148270713 (an answer originally by Greg Hewgill), or with int.from_bytes(): int.from_bytes(b'\x00\x01', "big") is 1, while int.from_bytes(b'\x00\x10', "little") is 4096.

At larger scale, Apache Arrow was integrated with Spark to increase the performance of DataFrame.toPandas(): data partitions are converted on the executor JVM to Arrow payload byte arrays, served to the Python process, and collected there into a pandas DataFrame.

Finally, for a raw byte dump of a whole frame, use df.to_records() to convert the DataFrame to a NumPy recarray, then call .tobytes() (named .tostring() in older NumPy releases) to get a string of bytes.
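As a concrete illustration of the to_records() route, here is a minimal sketch of a round trip from a DataFrame to raw bytes and back; the column names and values are invented for the example.

    import numpy as np
    import pandas as pd

    df = pd.DataFrame({"id": [10, 15, 20], "value": [13240000000.0, 3.14159, 234.1341]})

    # DataFrame -> structured NumPy recarray -> raw bytes
    rec = df.to_records(index=False)
    raw = rec.tobytes()               # .tostring() on very old NumPy versions
    print(type(raw), len(raw))        # <class 'bytes'> and the buffer size

    # Raw bytes -> structured array -> DataFrame (the dtype must travel with the bytes)
    restored = pd.DataFrame(np.frombuffer(raw, dtype=rec.dtype))
    print(restored)

The dtype (rec.dtype) is not stored in the byte string itself, so whoever reads the bytes back needs to know it; that is why self-describing formats such as Parquet or pickle are usually preferable for exchanging data.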
Going the other way for scalars, an int value can be converted into bytes with int.to_bytes(); the method is invoked on an int value and is not supported by Python 2 (it requires Python 3). Its syntax is covered a little further below.

DataFrames themselves come from many sources. pd.DataFrame() builds one from in-memory data (the TensorFlow example first imports tensorflow and pandas and then creates a dataframe with pd.DataFrame()). pd.read_json('data/simple.json') reads a JSON file; by default, numerical columns are cast to numeric types, so the math, physics and chemistry columns come out as int64, which df.info() confirms. Nested JSON needs pd.json_normalize(), which works differently from read_json() and flattens semi-structured JSON into a flat table:

    import json
    import pandas as pd

    with open('nested_sample.json', 'r') as f:
        data = json.loads(f.read())
    df = pd.json_normalize(data)

A frame with Lon/Lat columns can be promoted to a GeoDataFrame:

    from geopandas import GeoDataFrame
    from shapely.geometry import Point

    geometry = [Point(xy) for xy in zip(df.Lon, df.Lat)]
    df = df.drop(['Lon', 'Lat'], axis=1)
    gdf = GeoDataFrame(df, crs="EPSG:4326", geometry=geometry)

When the data lives in PostgreSQL, the first thing to do is install the python-postgres wrapper, psycopg2, to make it easier to connect to the database from Python. And if you need the frame's values as a Python set, the usual options are a for loop, a list comprehension, itertuples(), or flattening the underlying array.

For dtype clean-up, DataFrame.convert_dtypes(infer_objects=True, convert_string=True, convert_integer=True, convert_boolean=True, convert_floating=True), new in version 1.0.0, returns a new DataFrame where each column has been converted to the best possible dtype supporting pd.NA; the infer_objects parameter (bool, default True) controls whether object dtypes are converted to the best possible types, and the other flags do the same for string, integer, boolean and floating columns. A related forum question is how to convert an object-dtype column into strings before plotting; .astype(str) or convert_dtypes() does that, and a list mixing strings and bytes becomes a list of strings by calling .decode() on the byte elements.

The reverse of everything above, converting bytes data into a pandas DataFrame, comes up just as often; a sketch follows.
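A minimal sketch of loading bytes data into a DataFrame, assuming the bytes hold CSV text; the payload here is invented for illustration.

    import io
    import pandas as pd

    raw = b"part,qty\nfilter,12\npump,8\n"   # hypothetical CSV payload received as bytes

    # Wrap the bytes in a file-like buffer and let pandas parse it
    df = pd.read_csv(io.BytesIO(raw))
    print(df)

    # Equivalent: decode to text first and parse from a StringIO buffer
    df2 = pd.read_csv(io.StringIO(raw.decode("utf-8")))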
In-memory byte buffers come from the io module: just like what we do with variables, data can be kept as bytes in a buffer using BytesIO. A small demonstration:

    import io

    stream_str = io.BytesIO(b"JournalDev Python: \x00\x01")
    print(stream_str.getvalue())

Bytes also appear when reading streams. Calling streaming_body_1.read() on a streamed text file returns a bytes object, which is what print(type(readrawdata)) reports, so the file has effectively been converted to bytes. The mirror-image problem exists in to_csv: when a frame contains byte strings, pandas can either decode the bytes to text before writing, using the provided encoding, or raise an error directing the user to perform the decoding before calling to_csv; this trade-off was discussed in a pandas GitHub issue in August 2016.

Two side notes from the same pile of questions: reshaping df_1 so that its columns (excluding UserId and Date) become rows of df_2 is a reshape problem rather than a byte conversion, and a PIL image can be dumped to raw bytes by calling its .tobytes() method and converted onward to an array from there.

Pickle is yet another byte format for a DataFrame: it is reproducible, but it is only for internal use among trusted users, not for sharing with untrusted users, because unpickling arbitrary data is a security risk.
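A minimal sketch of the pickle route through memory, assuming the bytes only ever travel between trusted processes; the frame contents are invented.

    import pickle
    import pandas as pd

    df = pd.DataFrame({"part": ["filter", "pump"], "qty": [12, 8]})

    payload = pickle.dumps(df)          # DataFrame -> bytes
    print(type(payload), len(payload))

    restored = pickle.loads(payload)    # bytes -> DataFrame (trusted input only!)
    print(restored.equals(df))          # True

df.to_pickle(path) and pd.read_pickle(path) are the file-based equivalents.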
Back to scalar integers, the promised syntax is int.to_bytes(length, byteorder), where length is the desired length of the result in bytes and byteorder is "big" or "little"; the method requires at least Python 3. To convert negative integers, pass signed=True so that two's complement is used, e.g. (-7).to_bytes(2, "big", signed=True). To recap the opposite direction, there are several ways to convert bytes to strings in Python: for a byte string use the bytes.decode() method, and for a list of byte values call chr() on each element (positional formatting, one of the oldest methods in Python, is the classic way to turn an integer into a string). The Java equivalents are new String(byteArray) and String.getBytes().

On the off chance that a library refuses a BytesIO object directly, you can convert BytesIO into another writer, reader or wrapper by passing it to that wrapper's constructor, which is the usual trick when assembling an Excel file in memory. And if a user chooses to load CSV data as bytes, that choice should be specified explicitly, just as it is when writing out unicode, rather than being inferred from Python's encoding-specific markup.

For columnar binary output, DataFrame.to_parquet(path=None, engine='auto', compression='snappy', index=None, partition_cols=None, storage_options=None, **kwargs) writes the DataFrame to the binary Parquet format; you can choose different Parquet backends and have the option of compression.
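A minimal sketch of getting Parquet bytes without touching disk, assuming a Parquet engine such as pyarrow is installed; the frame contents are invented.

    import io
    import pandas as pd

    df = pd.DataFrame({"part": ["filter", "pump"], "qty": [12, 8]})

    buf = io.BytesIO()
    df.to_parquet(buf, compression="snappy")   # write Parquet-formatted bytes into the buffer
    parquet_bytes = buf.getvalue()

    # The bytes are self-describing, so they can be read straight back
    restored = pd.read_parquet(io.BytesIO(parquet_bytes))
    print(restored)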
Use the astype() method in pandas to convert one datatype to another: import the required library (import pandas as pd), create the DataFrame, and call astype() on the columns that need casting, as in the Weight example above. To build a DataFrame from a NumPy array, pass the array plus column names to the constructor, df = pd.DataFrame(np_array, columns=['Column1', 'Column2']); remember that each column in the array needs to be named.

DataFrame.to_string() takes the same idea further with formatters, functions applied to columns' elements by position or name; the result of each function must be a unicode string, a list or tuple of formatters must have length equal to the number of columns, and float_format is a one-parameter function (optional, default None) applied to columns' elements if they are floats.

For document output, the matplotlib library can be used to generate a PDF from a dataframe, and the separate pdf2image module (Python 3.6+) wraps pdftoppm and pdftocairo to convert a PDF into a PIL Image object (pip install pdf2image; Windows users have to build or download poppler, for which the @oschwartz10612 build is the most up to date). Whatever binary format you choose, when the data is accessed again with a Python read operation it is decoded back into the corresponding text, image or audio; byte objects hold the machine-readable form.

A PySpark DataFrame can be converted to a pandas DataFrame with toPandas(), and a PySpark DataFrame can be created from a pandas one with createDataFrame(pandas_df). The main difference between the two libraries is that PySpark operations run distributed and in parallel, so they are generally faster than pandas on large data. Arrow is available as an optimization for both conversions; to use it, set the Spark configuration spark.sql.execution.arrow.pyspark.enabled to true. This is the same mechanism as the Spark pull request quoted earlier: partitions become Arrow payload byte arrays on the executor JVM and are collected by the Python process, as sketched below.
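A minimal sketch of the Arrow-accelerated conversion, assuming a local SparkSession can be created; the session setup and sample rows are illustrative, not taken from the quoted pull request.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.master("local[*]").appName("arrow-demo").getOrCreate()

    # Enable Arrow for toPandas() / createDataFrame() conversions
    spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")

    sdf = spark.createDataFrame([(1, "filter"), (2, "pump")], ["id", "part"])
    pdf = sdf.toPandas()          # partitions are shipped to Python as Arrow record batches
    print(type(pdf), len(pdf))

    spark.stop()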
A worked example ties the HTTP and buffer pieces together; a Stack Overflow answer suggests the following code for pulling CSV data from an API into a frame:

    import io
    import requests
    import pandas as pd

    params = {"api_key": "abc", "format": "csv"}
    r = requests.get('https://www.parsehub.com/api/v2/runs/ttx8PT-EL6Rf/data', params=params)
    rawData = pd.read_csv(io.StringIO(r.content.decode('utf-8')))

Calling rec = df.to_records(index=False) and printing repr(rec) shows the structured data behind the byte dump, for example rec.array([(10, 18446744073709551615, 13240000000.0), (15, 230498234019, 3.14159), (20, 32094812309, 234.1341)], ...), which .tobytes() then flattens into a string of bytes.

The get_dummies() method converts a categorical DataFrame to binary indicator data; a complete two-column Student/Result example closes this section.

Finally, the built-in bytearray() constructor takes an optional source to initialize the array of bytes, an optional encoding if the source is a string, and an optional errors argument giving the action to take when the encoding conversion fails; the source parameter can initialize the byte array in several ways, as the next sketch shows.
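A short sketch of the bytearray/bytes/str relationships just described; the sample text is arbitrary.

    # bytearray from a str source requires an encoding
    ba = bytearray("auto spare parts", "utf-8")
    ba[0:4] = b"AUTO"                 # bytearrays are mutable

    b = bytes(ba)                     # freeze into an immutable bytes object
    print(b)                          # b'AUTO spare parts'
    print(b.decode("utf-8"))          # back to str: 'AUTO spare parts'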
Timestamps are a related conversion problem: a Python datetime object can be converted to UNIX time, the number of seconds or milliseconds since the 1970 epoch, and back again, and pandas' to_datetime() handles the same conversion for DataFrame columns; the UNIX time is merely the number of seconds between a particular date and the Unix epoch.

Byte payloads usually end up somewhere durable. Getting started with Python, Boto3 and AWS S3 means learning how to create objects, upload them to S3, download their contents and change their attributes; S3 is the object storage service offered by AWS, and its availability and durability make it a natural destination for serialized DataFrames, as the sketch below shows.
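A minimal sketch of pushing a DataFrame to S3 as CSV bytes, assuming AWS credentials are already configured; the bucket name and key are placeholders, not values from the original material.

    import io
    import boto3
    import pandas as pd

    df = pd.DataFrame({"part": ["filter", "pump"], "qty": [12, 8]})

    # Serialize to CSV bytes in memory
    csv_bytes = df.to_csv(index=False).encode("utf-8")

    # Upload the bytes as an S3 object (bucket and key are hypothetical)
    s3 = boto3.client("s3")
    s3.put_object(Bucket="my-example-bucket", Key="exports/parts.csv", Body=csv_bytes)

    # Download the object and parse the body (a streaming body whose .read() returns bytes)
    obj = s3.get_object(Bucket="my-example-bucket", Key="exports/parts.csv")
    restored = pd.read_csv(io.BytesIO(obj["Body"].read()))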
Hexadecimal text is one more byte representation: having converted a string into hex, the reverse uses bytes.fromhex(), which accepts a single hexadecimal string argument and converts it into a bytes literal, so the hex value from the previous step goes straight back through fromhex().

As part of data wrangling you may also need to cast a DataFrame column to the integer data type; the main case, casting a specific column, is exactly the astype() pattern shown earlier.

Finally, use the get_dummies() method to convert a categorical DataFrame to binary data. Following is a pandas DataFrame with two columns:

    dataFrame = pd.DataFrame({
        "Student": ['Jack', 'Robin', 'Ted', 'Scarlett', 'Kat'],
        "Result": ['Pass', 'Fail', 'Fail', 'Pass', 'Pass'],
    })

pd.get_dummies(dataFrame) turns each category into a 0/1 column, after which the frame is purely numeric and any of the byte conversions above (to_numpy().tobytes(), to_records(), to_parquet(), pickle) applies directly.
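Tying the pieces together, a minimal end-to-end sketch that one-hot encodes the same illustrative Student/Result frame and extracts raw bytes; the composition of steps is an assumption for demonstration, not code from any of the quoted sources.

    import numpy as np
    import pandas as pd

    dataFrame = pd.DataFrame({
        "Student": ['Jack', 'Robin', 'Ted', 'Scarlett', 'Kat'],
        "Result": ['Pass', 'Fail', 'Fail', 'Pass', 'Pass'],
    })

    # Categorical columns -> 0/1 indicator columns
    encoded = pd.get_dummies(dataFrame)

    # DataFrame -> NumPy array -> raw bytes
    arr = encoded.to_numpy(dtype=np.uint8)
    raw = arr.tobytes()

    # Bytes -> array again (shape and dtype must be known to the reader)
    restored = np.frombuffer(raw, dtype=np.uint8).reshape(arr.shape)
    print(np.array_equal(arr, restored))   # True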











Import data of auto spare parts
| Date | HS Code | Description | Origin Country | Port of Discharge | Unit | Quantity | Value (INR) | Per Unit (INR) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
|  |  | 39150T9AT01 ANT ASSY,AUTO RADIO (SPARE PARTS FOR HONDA AUTOMOBILES) | Thailand | Dadri-ACPL CFS | PCS | 1 | 1,925 | 1,925 |
|  |  | 31170RWK045 TENSIONER ASSY,AUTO (SPARE PARTS FOR HONDA AUTOMOBILES) | Thailand | Dadri-ACPL CFS | PCS | 1 | 3,109 | 3,109 |
|  |  | SPARE PARTS AUTO PARTS (UNBRANDED / UNPOPULARBRAND) | China | Chennai Sea | SET | 582 | 58,892 | 101 |
|  |  | 79602T9CK41 SW ASSY,AUTO A/C (SPARE PARTS FOR HONDA AUTOMOBILES) | Thailand | Dadri-ACPL CFS | PCS | 2 | 16,781 | 8,391 |
|  |  | PF4 POWERFOLD LEFT ELECT BASIC PART NO- 21982383 (AUTO SPARE PARTS) (4032 PCS) | Turkey | Nhava Sheva Sea | KGS | 544 | 1,195,460 | 2,198 |
|  |  | PF4 POWERFOLD RIGHT ELECT BASIC PART NO-21982384 (AUTO SPARE PARTS) (4032 PCS) | Turkey | Nhava Sheva Sea | KGS | 544 | 1,195,460 | 2,198 |
|  |  | WEAR INDICATOR (AUTO SPARE PARTS) | Germany | Kolkata Sea | PCS | 140 | 2,908 | 21 |
|  |  | STRUT BOOT (AUTO SPARE PARTS) | Germany | Kolkata Sea | PCS | 153 | 6,357 | 42 |
|  |  | FUEL FILTER (AUTO SPARE PARTS) | Germany | Kolkata Sea | PCS | 12 | 2,393 | 199 |
|  |  | PROPELLOR SHAFT SUPPORT (AUTO SPARE PARTS) | Germany | Kolkata Sea | PCS | 28 | 5,584 | 199 |
|  |  | OIL PUMP (AUTO SPARE PARTS) | Germany | Kolkata Sea | PCS | 8 | 2,493 | 312 |
|  |  | RADIATOR HOSE (AUTO SPARE PARTS) | Germany | Kolkata Sea | PCS | 8 | 598 | 75 |
|  |  | INJECTOR NOZZIE (AUTO SPARE PARTS) | China | Bombay Air Cargo | PCS | 1,280 | 10,605 | 8 |
|  |  | CONTROL VALVE (AUTO SPARE PARTS) | China | Bombay Air Cargo | PCS | 500 | 2,417 | 5 |
|  |  | DOOR VISOR (AUTO SPARE PARTS & ACCESSORIES) | Thailand | Nhava Sheva Sea | SET | 20 | 5,185 | 259 |
|  |  | BRAKE PAD (AUTO SPARE PARTS) PART NO.7800 | Spain | Kolkata Sea | PCS | 30 | 7,972 | 266 |
|  |  | BRAKE PAD (AUTO SPARE PARTS) PART NO.9500 | Spain | Kolkata Sea | PCS | 80 | 25,407 | 318 |
|  |  | BRAKE PAD (AUTO SPARE PARTS) PART NO.9600 | Spain | Kolkata Sea | PCS | 70 | 23,139 | 331 |
|  |  | BRAKE PAD (AUTO SPARE PARTS) PART NO.3100 | Spain | Kolkata Sea | PCS | 10 | 6,741 | 674 |
|  |  | BRAKE PAD (AUTO SPARE PARTS) PART NO.4500 | Spain | Kolkata Sea | PCS | 80 | 26,444 | 331 |